Apr 17 17:07:52.203470 ip-10-0-138-224 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:07:52.687853 ip-10-0-138-224 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:07:52.687853 ip-10-0-138-224 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:07:52.687853 ip-10-0-138-224 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:07:52.688395 ip-10-0-138-224 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:07:52.688395 ip-10-0-138-224 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:07:52.688606 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.688532    2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:07:52.694084 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694069    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:07:52.694084 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694084    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694089    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694092    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694095    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694098    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694102    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694104    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694107    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694110    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694112    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694122    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694125    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694127    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694130    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694132    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694135    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694137    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694140    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694143    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694145    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:07:52.694146 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694148    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694151    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694154    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694157    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694160    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694164    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694168    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694171    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694174    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694176    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694179    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694182    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694184    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694187    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694189    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694192    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694194    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694197    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694200    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694202    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:07:52.694647 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694216    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694219    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694221    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694224    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694226    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694229    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694231    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694234    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694236    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694240    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694245    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694248    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694251    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694254    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694258    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694261    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694264    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694267    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694270    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694273    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:07:52.695169 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694276    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694278    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694282    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694284    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694287    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694290    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694292    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694294    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694297    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694300    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694303    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694305    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694307    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694310    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694313    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694315    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694318    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694320    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694323    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:07:52.695662 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694325    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694328    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694331    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694333    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694336    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694339    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694702    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694707    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694710    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694712    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694715    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694718    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694721    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694724    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694726    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694729    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694732    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694735    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694737    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694740    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:07:52.696116 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694743    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694745    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694747    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694756    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694759    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694762    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694764    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694767    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694770    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694773    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694775    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694778    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694780    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694782    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694785    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694788    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694790    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694793    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694795    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:07:52.696596 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694799    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694802    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694805    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694808    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694810    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694813    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694815    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694818    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694820    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694825    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694828    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694831    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694834    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694837    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694841    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694843    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694846    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694849    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694852    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:07:52.697069 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694855    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694858    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694860    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694863    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694866    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694868    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694871    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694873    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694875    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694878    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694881    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694883    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694886    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694888    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694891    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694894    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694896    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694900    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694903    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694906    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:07:52.697583 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694908    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694911    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694913    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694916    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694918    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694921    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694923    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694926    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694928    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694930    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694933    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694936    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694938    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.694943    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695566    2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695578    2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695583    2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695588    2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695592    2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695596    2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695600    2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:07:52.698075 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695605    2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695608    2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695611    2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695614    2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695618    2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695621    2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695624    2572 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695627    2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695630    2572 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695633    2572 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695635    2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695638    2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695643    2572 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695646    2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695649    2572 flags.go:64] FLAG: --config-dir=""
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695652    2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695655    2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695659    2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695662    2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695665    2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695669    2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695672    2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695675    2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695678    2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695683    2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 17:07:52.698590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695686    2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695690    2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695693    2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695696    2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695699    2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695702    2572 flags.go:64] FLAG: --enable-server="true"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695705    2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695709    2572 flags.go:64] FLAG: --event-burst="100"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695712    2572 flags.go:64] FLAG: --event-qps="50"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695715    2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695718    2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695721    2572 flags.go:64] FLAG: --eviction-hard=""
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695725    2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695728    2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695731    2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695734    2572 flags.go:64] FLAG: --eviction-soft=""
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695737    2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695740    2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695743    2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695745    2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695748    2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695751    2572 flags.go:64] FLAG:
--fail-swap-on="true" Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695754 2572 flags.go:64] FLAG: --feature-gates="" Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695758 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695761 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 17:07:52.699183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695764 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695767 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695771 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695774 2572 flags.go:64] FLAG: --help="false" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695776 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-138-224.ec2.internal" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695779 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695784 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695787 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695790 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695794 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 
17:07:52.695797 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695799 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695802 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695805 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695808 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695811 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695813 2572 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695817 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695820 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695823 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695826 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695829 2572 flags.go:64] FLAG: --lock-file="" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695831 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695835 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:07:52.699804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695837 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:07:52.700404 
ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695843 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695845 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695849 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695852 2572 flags.go:64] FLAG: --logging-format="text" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695854 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695858 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695860 2572 flags.go:64] FLAG: --manifest-url="" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695863 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695867 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695870 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695874 2572 flags.go:64] FLAG: --max-pods="110" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695877 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695880 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695885 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695888 2572 
flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695891 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695894 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695897 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695904 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695907 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695910 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695913 2572 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:07:52.700404 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695916 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695922 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695925 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695928 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695931 2572 flags.go:64] FLAG: --port="10250" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695934 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: 
I0417 17:07:52.695937 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07cbf664ad3a4672f" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695940 2572 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695943 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695946 2572 flags.go:64] FLAG: --register-node="true" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695949 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695953 2572 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695957 2572 flags.go:64] FLAG: --registry-burst="10" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695960 2572 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695963 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695966 2572 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695970 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695973 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695976 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695979 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695982 2572 flags.go:64] FLAG: --runonce="false" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: 
I0417 17:07:52.695985 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695988 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695990 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695995 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.695998 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:07:52.700961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696001 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696004 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696007 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696010 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696013 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696015 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696018 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696022 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696025 2572 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696029 2572 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696034 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696037 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696040 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696044 2572 flags.go:64] FLAG: --tls-min-version="" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696047 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696049 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696052 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696055 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696058 2572 flags.go:64] FLAG: --v="2" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696062 2572 flags.go:64] FLAG: --version="false" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696066 2572 flags.go:64] FLAG: --vmodule="" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696070 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.696074 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696173 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:07:52.701610 ip-10-0-138-224 
kubenswrapper[2572]: W0417 17:07:52.696178 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:07:52.701610 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696182 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696185 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696188 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696191 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696195 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696199 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696202 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696220 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696225 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696229 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696233 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 
17:07:52.696236 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696239 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696242 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696244 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696247 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696250 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696252 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696255 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:07:52.702217 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696258 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696260 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696263 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696265 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696268 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 
17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696271 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696273 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696276 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696279 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696281 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696284 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696286 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696289 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696291 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696294 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696297 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696299 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696302 2572 feature_gate.go:328] unrecognized 
feature gate: SignatureStores Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696307 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696309 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:07:52.702878 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696313 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696317 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696319 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696322 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696324 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696327 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696331 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696334 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696337 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696340 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696342 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696345 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696347 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696350 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696353 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696355 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696358 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696360 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696363 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:07:52.703660 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696366 2572 
feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696368 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696371 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696373 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696376 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696379 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696381 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696383 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696386 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696388 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696391 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696395 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696397 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:07:52.704138 
ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696401 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696404 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696407 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696410 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696412 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696415 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696418 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:07:52.704138 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696420 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:07:52.704640 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696423 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:07:52.704640 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696426 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:07:52.704640 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696429 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:07:52.704640 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696431 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:07:52.704640 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.696434 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:07:52.704640 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.697222 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:07:52.706163 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.706144 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 17:07:52.706216 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.706164 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 17:07:52.706250 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706227 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:07:52.706250 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706234 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:07:52.706250 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706237 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:07:52.706250 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706240 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:07:52.706250 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706243 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:07:52.706250 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706246 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:07:52.706250 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706248 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:07:52.706250 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706252 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706255 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706258 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706260 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706263 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706265 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706268 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706271 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706273 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706276 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706279 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706281 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706285 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706288 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706291 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706294 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706297 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706299 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706302 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:07:52.706447 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706304 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706309 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706313 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706316 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706319 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706321 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706324 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706327 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706330 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706332 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706335 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706338 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706340 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706343 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706345 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706348 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706351 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706353 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706356 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:07:52.706974 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706359 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706361 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706364 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706367 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706370 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706372 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706375 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706377 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706380 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706382 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706385 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706387 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706390 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706392 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706395 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706397 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706400 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706403 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706405 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706408 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:07:52.707449 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706411 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706414 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706416 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706419 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706422 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706425 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706427 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706430 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706432 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706435 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706438 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706441 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706443 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706446 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706449 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706451 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706454 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706457 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706460 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706463 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:07:52.707937 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706465 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.706470 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706568 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706574 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706577 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706580 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706583 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706586 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706589 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706591 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706594 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706597 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706600 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706603 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706605 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:07:52.708466 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706608 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706610 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706613 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706616 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706618 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706621 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706624 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706627 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706629 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706633 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706635 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706638 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706640 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706643 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706645 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706648 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706651 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706653 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706656 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:07:52.708833 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706658 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706661 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706663 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706666 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706670 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706673 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706676 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706678 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706681 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706684 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706686 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706689 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706691 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706694 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706696 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706699 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706702 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706704 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706707 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706709 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:07:52.709393 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706712 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706714 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706717 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706720 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706723 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706725 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706728 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706730 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706733 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706735 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706738 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706740 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706742 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706745 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706747 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706750 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706752 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706755 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706757 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:07:52.709894 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706761 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706765 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706768 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706771 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706773 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706776 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706779 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706781 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706784 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706786 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706788 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706791 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706793 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706796 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:52.706798 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.706803 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:07:52.710379 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.707524 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 17:07:52.711679 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.711665 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 17:07:52.712612 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.712592 2572 server.go:1019] "Starting client certificate rotation"
Apr 17 17:07:52.712713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.712695 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:07:52.712748 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.712730 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:07:52.739525 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.739509 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:07:52.745223 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.745193 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:07:52.762142 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.761969 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 17 17:07:52.768357 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.768336 2572 log.go:25] "Validated CRI v1 image API"
Apr 17 17:07:52.769628 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.769614 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 17:07:52.771617 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.771602 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:07:52.773980 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.773957 2572 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c25b6b8d-de95-42a6-a00f-fcd301525911:/dev/nvme0n1p4 eabf2a7e-e8f8-4974-a636-2fab309e00f8:/dev/nvme0n1p3]
Apr 17 17:07:52.774024 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.773981 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 17:07:52.779541 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.779442 2572 manager.go:217] Machine: {Timestamp:2026-04-17 17:07:52.777616258 +0000 UTC m=+0.445299902 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3105092 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21e6e4da7280d41117d46745acb780 SystemUUID:ec21e6e4-da72-80d4-1117-d46745acb780 BootID:70b2d594-08fb-4434-bf70-f3f94e5fb6e5 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7f:13:a0:5b:df Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7f:13:a0:5b:df Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:8f:be:24:9d:dc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 17:07:52.779541 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.779536 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 17:07:52.779638 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.779607 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 17:07:52.780000 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.779981 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 17:07:52.780123 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.780001 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-224.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 17:07:52.780166 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.780132 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 17:07:52.780166 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.780141 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 17:07:52.780166 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.780154
2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:07:52.780264 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.780171 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:07:52.783962 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.783951 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:07:52.784201 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.784192 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 17:07:52.786777 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.786768 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 17 17:07:52.786810 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.786780 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 17:07:52.786810 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.786792 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 17:07:52.786810 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.786804 2572 kubelet.go:397] "Adding apiserver pod source" Apr 17 17:07:52.786893 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.786822 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 17:07:52.787843 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.787830 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:07:52.787890 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.787848 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:07:52.790679 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.790664 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 17:07:52.792005 ip-10-0-138-224 
kubenswrapper[2572]: I0417 17:07:52.791993 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:07:52.793977 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.793963 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:07:52.794009 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.793987 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:07:52.794009 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.793997 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:07:52.794009 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.794007 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:07:52.794090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.794015 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:07:52.794090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.794024 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:07:52.794090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.794032 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 17:07:52.794090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.794040 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:07:52.794090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.794050 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:07:52.794090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.794059 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 17:07:52.794090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.794083 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 
17:07:52.794271 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.794096 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:07:52.794987 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.794973 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qfdlr" Apr 17 17:07:52.795866 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.795855 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:07:52.795904 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.795868 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:07:52.799492 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.799480 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 17:07:52.799537 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.799520 2572 server.go:1295] "Started kubelet" Apr 17 17:07:52.799611 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.799589 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 17:07:52.799661 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.799619 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 17:07:52.799699 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.799675 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:07:52.800387 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:52.800357 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-224.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 17:07:52.800516 ip-10-0-138-224 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 17:07:52.800624 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:52.800517 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 17:07:52.800624 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.800580 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-224.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 17:07:52.801396 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.801260 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:07:52.802807 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.802794 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:07:52.803407 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.803391 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qfdlr" Apr 17 17:07:52.808646 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:52.808630 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 17:07:52.808795 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:52.807715 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-224.ec2.internal.18a733ed69fe58a8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-224.ec2.internal,UID:ip-10-0-138-224.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-224.ec2.internal,},FirstTimestamp:2026-04-17 17:07:52.799492264 +0000 UTC m=+0.467175908,LastTimestamp:2026-04-17 17:07:52.799492264 +0000 UTC m=+0.467175908,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-224.ec2.internal,}" Apr 17 17:07:52.809112 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.809097 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:07:52.809202 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.809114 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:07:52.810926 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.810773 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 17:07:52.811004 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.810932 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:07:52.811090 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:52.811064 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-224.ec2.internal\" not found" Apr 17 17:07:52.811321 
ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.811307 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 17 17:07:52.811403 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.811325 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 17 17:07:52.811571 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.811489 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:07:52.812379 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.812324 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:07:52.812477 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.812381 2572 factory.go:55] Registering systemd factory Apr 17 17:07:52.812477 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.812393 2572 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:07:52.812660 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.812642 2572 factory.go:153] Registering CRI-O factory Apr 17 17:07:52.812660 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.812657 2572 factory.go:223] Registration of the crio container factory successfully Apr 17 17:07:52.812758 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.812709 2572 factory.go:103] Registering Raw factory Apr 17 17:07:52.812758 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.812723 2572 manager.go:1196] Started watching for new ooms in manager Apr 17 17:07:52.814563 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.814547 2572 manager.go:319] Starting recovery of all containers Apr 17 17:07:52.815578 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.815559 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:52.818719 ip-10-0-138-224 kubenswrapper[2572]: 
E0417 17:07:52.818698 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-224.ec2.internal\" not found" node="ip-10-0-138-224.ec2.internal" Apr 17 17:07:52.825056 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.824952 2572 manager.go:324] Recovery completed Apr 17 17:07:52.829048 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.829034 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:52.831722 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.831707 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:52.831797 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.831740 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:52.831797 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.831762 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:52.832284 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.832267 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 17:07:52.832284 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.832279 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 17:07:52.832390 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.832295 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:07:52.834626 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.834615 2572 policy_none.go:49] "None policy: Start" Apr 17 17:07:52.834660 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.834630 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 17:07:52.834660 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.834639 2572 state_mem.go:35] "Initializing new in-memory 
state store" Apr 17 17:07:52.869815 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.869801 2572 manager.go:341] "Starting Device Plugin manager" Apr 17 17:07:52.883841 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:52.869835 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 17:07:52.883841 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.869845 2572 server.go:85] "Starting device plugin registration server" Apr 17 17:07:52.883841 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.870069 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 17:07:52.883841 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.870080 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 17:07:52.883841 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.870160 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 17:07:52.883841 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.870252 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 17:07:52.883841 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.870259 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 17:07:52.883841 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:52.870898 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 17:07:52.883841 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:52.870929 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-224.ec2.internal\" not found" Apr 17 17:07:52.943844 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.943792 2572 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 17 17:07:52.944940 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.944921 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 17:07:52.945011 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.944944 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 17:07:52.945011 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.944959 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 17:07:52.945011 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.944965 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 17:07:52.945011 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:52.944993 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 17:07:52.947719 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.947696 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:52.971046 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.971030 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:52.971939 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.971922 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:52.972021 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.971953 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:52.972021 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.971963 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:52.972021 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.971993 2572 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-224.ec2.internal" Apr 17 17:07:52.980485 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:52.980472 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-224.ec2.internal" Apr 17 17:07:52.980534 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:52.980492 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-224.ec2.internal\": node \"ip-10-0-138-224.ec2.internal\" not found" Apr 17 17:07:53.005045 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.005031 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-224.ec2.internal\" not found" Apr 17 17:07:53.045480 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.045461 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-224.ec2.internal"] Apr 17 17:07:53.045532 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.045527 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:53.046258 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.046237 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:53.046346 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.046266 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:53.046346 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.046280 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:53.047697 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.047683 2572 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 17 17:07:53.047822 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.047808 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.047901 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.047836 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:53.048364 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.048351 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:53.048364 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.048358 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:53.048491 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.048379 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:53.048491 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.048383 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:53.048491 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.048393 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:53.048491 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.048396 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:53.050925 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.050908 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.051031 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.050936 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:53.051671 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.051656 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:53.051747 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.051677 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:53.051747 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.051692 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:53.078772 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.078754 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-224.ec2.internal\" not found" node="ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.083074 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.083061 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-224.ec2.internal\" not found" node="ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.105294 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.105279 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-224.ec2.internal\" not found" Apr 17 17:07:53.113637 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.113619 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/03b99da47aa26d733084fbd12fd690dc-config\") pod 
\"kube-apiserver-proxy-ip-10-0-138-224.ec2.internal\" (UID: \"03b99da47aa26d733084fbd12fd690dc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.113698 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.113642 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5d4ee3205b15c3555a6b381bfd25f947-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal\" (UID: \"5d4ee3205b15c3555a6b381bfd25f947\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.113698 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.113660 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d4ee3205b15c3555a6b381bfd25f947-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal\" (UID: \"5d4ee3205b15c3555a6b381bfd25f947\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.205602 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.205550 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-224.ec2.internal\" not found" Apr 17 17:07:53.214026 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.214006 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/03b99da47aa26d733084fbd12fd690dc-config\") pod \"kube-apiserver-proxy-ip-10-0-138-224.ec2.internal\" (UID: \"03b99da47aa26d733084fbd12fd690dc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.214109 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.214083 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/5d4ee3205b15c3555a6b381bfd25f947-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal\" (UID: \"5d4ee3205b15c3555a6b381bfd25f947\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.214173 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.214104 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/03b99da47aa26d733084fbd12fd690dc-config\") pod \"kube-apiserver-proxy-ip-10-0-138-224.ec2.internal\" (UID: \"03b99da47aa26d733084fbd12fd690dc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.214173 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.214099 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5d4ee3205b15c3555a6b381bfd25f947-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal\" (UID: \"5d4ee3205b15c3555a6b381bfd25f947\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.214173 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.214154 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d4ee3205b15c3555a6b381bfd25f947-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal\" (UID: \"5d4ee3205b15c3555a6b381bfd25f947\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.214310 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.214217 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d4ee3205b15c3555a6b381bfd25f947-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal\" (UID: \"5d4ee3205b15c3555a6b381bfd25f947\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.306266 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.306246 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-224.ec2.internal\" not found" Apr 17 17:07:53.380854 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.380834 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.386244 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.386227 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.406919 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.406896 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-224.ec2.internal\" not found" Apr 17 17:07:53.507527 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.507474 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-224.ec2.internal\" not found" Apr 17 17:07:53.608140 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.608116 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-224.ec2.internal\" not found" Apr 17 17:07:53.663233 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.663199 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:53.705056 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.705037 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:53.709286 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.709269 2572 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.712328 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.712315 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 17:07:53.712429 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.712415 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:07:53.712472 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.712440 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:07:53.712472 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.712447 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:07:53.712539 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.712472 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:07:53.712539 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.712488 2572 kubelet.go:3342] "Failed creating a mirror pod" err="Post 
\"https://ac80004889c49441e93e336808239913-5e455a0c30e4a692.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.138.224:46066->54.173.152.190:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.712539 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.712516 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-224.ec2.internal" Apr 17 17:07:53.730127 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.730110 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:07:53.787762 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.787710 2572 apiserver.go:52] "Watching apiserver" Apr 17 17:07:53.793904 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.793886 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 17:07:53.794245 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.794228 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-h6fpt","openshift-network-diagnostics/network-check-target-rqtb2","openshift-network-operator/iptables-alerter-vk4s9","kube-system/konnectivity-agent-vf59c","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9","openshift-multus/multus-additional-cni-plugins-kn4sv","openshift-multus/multus-cjws9","openshift-ovn-kubernetes/ovnkube-node-8ltqc","kube-system/kube-apiserver-proxy-ip-10-0-138-224.ec2.internal","openshift-cluster-node-tuning-operator/tuned-s8v7c","openshift-dns/node-resolver-rxccf","openshift-image-registry/node-ca-vbgg6"] Apr 17 17:07:53.795688 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.795674 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:07:53.795756 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.795733 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:07:53.798079 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.798064 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vk4s9" Apr 17 17:07:53.799383 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.799362 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:07:53.799505 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.799456 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vf59c" Apr 17 17:07:53.799505 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.799437 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:07:53.800430 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.800414 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:07:53.800519 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.800417 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 17:07:53.800579 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.800513 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-s2gp7\"" Apr 17 17:07:53.800579 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.800421 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 17:07:53.801047 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.801031 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.801491 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.801474 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 17:07:53.801596 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.801517 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xdltp\"" Apr 17 17:07:53.801658 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.801607 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 17:07:53.802504 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.802482 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.803405 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.803388 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 17:07:53.803770 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.803752 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 17:07:53.804024 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.803921 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 17:07:53.804362 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.804339 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vzwnp\"" Apr 17 17:07:53.805135 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.804499 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.805135 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.804871 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 17:07:53.805135 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.805030 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 17:07:53.805135 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.805083 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 17:07:53.805470 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.805435 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-2br7m\"" Apr 17 17:07:53.806566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.806059 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 17:07:53.806566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.806301 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 17:07:53.806707 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.806570 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:02:52 +0000 UTC" deadline="2028-01-10 04:45:12.323385938 +0000 UTC" Apr 17 17:07:53.806707 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.806591 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15179h37m18.516798067s" Apr 17 17:07:53.808189 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.808062 2572 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9qw6x\"" Apr 17 17:07:53.808189 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.808073 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 17:07:53.809069 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.809048 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.809157 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.809097 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.809413 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.809396 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 17:07:53.810457 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.810423 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-rxccf" Apr 17 17:07:53.811554 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.811534 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:07:53.811679 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.811564 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 17:07:53.811679 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.811606 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 17:07:53.811679 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.811631 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:07:53.811679 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.811569 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vbgg6" Apr 17 17:07:53.811941 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.811795 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 17:07:53.811941 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.811831 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 17:07:53.811941 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.811831 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kln2x\"" Apr 17 17:07:53.812306 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.812114 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kcdpt\"" Apr 17 17:07:53.812306 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.812225 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 17:07:53.812625 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.812534 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 17:07:53.812625 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.812601 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 17:07:53.812625 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.812622 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mnfwx\"" Apr 17 17:07:53.812793 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.812634 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 
17:07:53.812922 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.812905 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 17:07:53.813939 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.813920 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 17:07:53.814406 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.814390 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 17:07:53.814406 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.814399 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2thgw\"" Apr 17 17:07:53.814542 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.814403 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 17:07:53.817289 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817274 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5d24e31e-3394-41c1-a9b3-f9761295236c-iptables-alerter-script\") pod \"iptables-alerter-vk4s9\" (UID: \"5d24e31e-3394-41c1-a9b3-f9761295236c\") " pod="openshift-network-operator/iptables-alerter-vk4s9" Apr 17 17:07:53.817364 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817299 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs\") pod \"network-metrics-daemon-h6fpt\" (UID: \"072c5e3f-6547-42c7-8e8e-c517d7283183\") " pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:07:53.817364 
ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817331 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-var-lib-kubelet\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.817485 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817391 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c3fbb6b8-715e-4512-b7ce-584ff3fdf72e-tmp-dir\") pod \"node-resolver-rxccf\" (UID: \"c3fbb6b8-715e-4512-b7ce-584ff3fdf72e\") " pod="openshift-dns/node-resolver-rxccf" Apr 17 17:07:53.817485 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817412 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-etc-selinux\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.817485 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817430 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv2jq\" (UniqueName: \"kubernetes.io/projected/44929b09-362a-4e1a-aa6b-88795d3cd03c-kube-api-access-fv2jq\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.817485 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817450 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e3e9c005-8254-4300-8c36-63018e536c0f-system-cni-dir\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.817485 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817466 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-etc-openvswitch\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.817759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817500 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-cni-bin\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.817759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817533 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/990242e6-25ef-4749-8d89-b0083d90c418-ovnkube-config\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.817759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817570 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/990242e6-25ef-4749-8d89-b0083d90c418-env-overrides\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.817759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817598 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rr6v\" (UniqueName: \"kubernetes.io/projected/5d24e31e-3394-41c1-a9b3-f9761295236c-kube-api-access-5rr6v\") pod \"iptables-alerter-vk4s9\" (UID: \"5d24e31e-3394-41c1-a9b3-f9761295236c\") " pod="openshift-network-operator/iptables-alerter-vk4s9" Apr 17 17:07:53.817759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817626 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-node-log\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.817759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817683 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-sysconfig\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.817759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-kubernetes\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.817759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817733 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-842hx\" (UniqueName: \"kubernetes.io/projected/e2ff083b-5e25-4ad5-9ebe-7d015658c212-kube-api-access-842hx\") pod \"node-ca-vbgg6\" (UID: \"e2ff083b-5e25-4ad5-9ebe-7d015658c212\") " 
pod="openshift-image-registry/node-ca-vbgg6" Apr 17 17:07:53.817759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817758 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-device-dir\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.818090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-run-netns\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.818090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817806 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-run-ovn\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.818090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817834 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-cnibin\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.818090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817854 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-multus-conf-dir\") 
pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.818090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817869 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c39d4ed7-fddd-4bc2-8cfb-8d0155da370b-konnectivity-ca\") pod \"konnectivity-agent-vf59c\" (UID: \"c39d4ed7-fddd-4bc2-8cfb-8d0155da370b\") " pod="kube-system/konnectivity-agent-vf59c" Apr 17 17:07:53.818090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817885 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8sck\" (UniqueName: \"kubernetes.io/projected/e3e9c005-8254-4300-8c36-63018e536c0f-kube-api-access-j8sck\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.818090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817932 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-tuned\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.818090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817946 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-system-cni-dir\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.818090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.817965 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-var-lib-cni-bin\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.818090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818006 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e2ff083b-5e25-4ad5-9ebe-7d015658c212-serviceca\") pod \"node-ca-vbgg6\" (UID: \"e2ff083b-5e25-4ad5-9ebe-7d015658c212\") " pod="openshift-image-registry/node-ca-vbgg6" Apr 17 17:07:53.818090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818039 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e3e9c005-8254-4300-8c36-63018e536c0f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.818090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818061 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e3e9c005-8254-4300-8c36-63018e536c0f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.818090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818087 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vg96\" (UniqueName: \"kubernetes.io/projected/6ae40133-7bde-4034-8619-0eae17ab89a9-kube-api-access-6vg96\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " 
pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818101 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5d24e31e-3394-41c1-a9b3-f9761295236c-host-slash\") pod \"iptables-alerter-vk4s9\" (UID: \"5d24e31e-3394-41c1-a9b3-f9761295236c\") " pod="openshift-network-operator/iptables-alerter-vk4s9" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818119 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-hostroot\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818142 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2ff083b-5e25-4ad5-9ebe-7d015658c212-host\") pod \"node-ca-vbgg6\" (UID: \"e2ff083b-5e25-4ad5-9ebe-7d015658c212\") " pod="openshift-image-registry/node-ca-vbgg6" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818161 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c39d4ed7-fddd-4bc2-8cfb-8d0155da370b-agent-certs\") pod \"konnectivity-agent-vf59c\" (UID: \"c39d4ed7-fddd-4bc2-8cfb-8d0155da370b\") " pod="kube-system/konnectivity-agent-vf59c" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818177 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818192 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3e9c005-8254-4300-8c36-63018e536c0f-os-release\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-modprobe-d\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818242 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-sysctl-d\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818278 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3e9c005-8254-4300-8c36-63018e536c0f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818302 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-sys\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818322 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npdj6\" (UniqueName: \"kubernetes.io/projected/072c5e3f-6547-42c7-8e8e-c517d7283183-kube-api-access-npdj6\") pod \"network-metrics-daemon-h6fpt\" (UID: \"072c5e3f-6547-42c7-8e8e-c517d7283183\") " pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818361 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-multus-socket-dir-parent\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-var-lib-cni-multus\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818404 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-run-multus-certs\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 
17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818420 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-run-systemd\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.818566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818442 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/990242e6-25ef-4749-8d89-b0083d90c418-ovnkube-script-lib\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818456 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-lib-modules\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818477 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-host\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818498 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-multus-cni-dir\") pod \"multus-cjws9\" (UID: 
\"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818515 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-os-release\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818532 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-systemd-units\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818545 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-run-openvswitch\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818560 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818578 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/990242e6-25ef-4749-8d89-b0083d90c418-ovn-node-metrics-cert\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-run\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818605 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ae40133-7bde-4034-8619-0eae17ab89a9-tmp\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818620 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-etc-kubernetes\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818634 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-socket-dir\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818668 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-sysctl-conf\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c3fbb6b8-715e-4512-b7ce-584ff3fdf72e-hosts-file\") pod \"node-resolver-rxccf\" (UID: \"c3fbb6b8-715e-4512-b7ce-584ff3fdf72e\") " pod="openshift-dns/node-resolver-rxccf" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xnb2\" (UniqueName: \"kubernetes.io/projected/c3fbb6b8-715e-4512-b7ce-584ff3fdf72e-kube-api-access-7xnb2\") pod \"node-resolver-rxccf\" (UID: \"c3fbb6b8-715e-4512-b7ce-584ff3fdf72e\") " pod="openshift-dns/node-resolver-rxccf" Apr 17 17:07:53.819175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818774 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-registration-dir\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818807 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3e9c005-8254-4300-8c36-63018e536c0f-cnibin\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " 
pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-log-socket\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818878 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jj7v\" (UniqueName: \"kubernetes.io/projected/990242e6-25ef-4749-8d89-b0083d90c418-kube-api-access-6jj7v\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818920 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-run-netns\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818950 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztv9r\" (UniqueName: \"kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r\") pod \"network-check-target-rqtb2\" (UID: \"f75def8c-a1a7-475d-934b-23dad26ea8c2\") " pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818971 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/e3e9c005-8254-4300-8c36-63018e536c0f-cni-binary-copy\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818986 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-systemd\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.818999 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-var-lib-kubelet\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.819026 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-sys-fs\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.819039 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-slash\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.819070 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-cni-netd\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.819093 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3feca2fd-76f4-4d80-9641-209f7a166211-multus-daemon-config\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.819113 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqtr2\" (UniqueName: \"kubernetes.io/projected/3feca2fd-76f4-4d80-9641-209f7a166211-kube-api-access-mqtr2\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.819127 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-var-lib-openvswitch\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.819143 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3feca2fd-76f4-4d80-9641-209f7a166211-cni-binary-copy\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" 
Apr 17 17:07:53.819713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.819178 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-run-k8s-cni-cncf-io\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.820136 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.819230 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-kubelet\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.820136 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.819258 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-run-ovn-kubernetes\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.822254 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.822234 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:07:53.842999 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.842982 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fzxbr" Apr 17 17:07:53.851469 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.851454 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fzxbr" Apr 17 17:07:53.877169 ip-10-0-138-224 
kubenswrapper[2572]: W0417 17:07:53.877143 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d4ee3205b15c3555a6b381bfd25f947.slice/crio-b59ae3810d436f8c6112221212b261670ab79f5b26a57509159afaf57532898f WatchSource:0}: Error finding container b59ae3810d436f8c6112221212b261670ab79f5b26a57509159afaf57532898f: Status 404 returned error can't find the container with id b59ae3810d436f8c6112221212b261670ab79f5b26a57509159afaf57532898f Apr 17 17:07:53.877398 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:53.877383 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03b99da47aa26d733084fbd12fd690dc.slice/crio-297fc1d8fd4f9fb3653b30f4368c3160fbb72b0a289753f42afbc501848177c5 WatchSource:0}: Error finding container 297fc1d8fd4f9fb3653b30f4368c3160fbb72b0a289753f42afbc501848177c5: Status 404 returned error can't find the container with id 297fc1d8fd4f9fb3653b30f4368c3160fbb72b0a289753f42afbc501848177c5 Apr 17 17:07:53.882655 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.882639 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:07:53.920461 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920437 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-socket-dir\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.920543 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920487 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-sysctl-conf\") pod \"tuned-s8v7c\" (UID: 
\"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.920543 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c3fbb6b8-715e-4512-b7ce-584ff3fdf72e-hosts-file\") pod \"node-resolver-rxccf\" (UID: \"c3fbb6b8-715e-4512-b7ce-584ff3fdf72e\") " pod="openshift-dns/node-resolver-rxccf" Apr 17 17:07:53.920659 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xnb2\" (UniqueName: \"kubernetes.io/projected/c3fbb6b8-715e-4512-b7ce-584ff3fdf72e-kube-api-access-7xnb2\") pod \"node-resolver-rxccf\" (UID: \"c3fbb6b8-715e-4512-b7ce-584ff3fdf72e\") " pod="openshift-dns/node-resolver-rxccf" Apr 17 17:07:53.920659 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920516 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-socket-dir\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.920659 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920565 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-registration-dir\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.920659 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/c3fbb6b8-715e-4512-b7ce-584ff3fdf72e-hosts-file\") pod \"node-resolver-rxccf\" (UID: \"c3fbb6b8-715e-4512-b7ce-584ff3fdf72e\") " pod="openshift-dns/node-resolver-rxccf" Apr 17 17:07:53.920659 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920589 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3e9c005-8254-4300-8c36-63018e536c0f-cnibin\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.920659 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920626 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-log-socket\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.920659 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920604 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-sysctl-conf\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.920659 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-registration-dir\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920650 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6jj7v\" (UniqueName: \"kubernetes.io/projected/990242e6-25ef-4749-8d89-b0083d90c418-kube-api-access-6jj7v\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920679 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-log-socket\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920694 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-run-netns\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3e9c005-8254-4300-8c36-63018e536c0f-cnibin\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920722 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztv9r\" (UniqueName: \"kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r\") pod \"network-check-target-rqtb2\" (UID: \"f75def8c-a1a7-475d-934b-23dad26ea8c2\") " pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920750 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3e9c005-8254-4300-8c36-63018e536c0f-cni-binary-copy\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920774 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-systemd\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920724 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-run-netns\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920833 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-systemd\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920862 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-var-lib-kubelet\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920887 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-sys-fs\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-slash\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920911 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-var-lib-kubelet\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-cni-netd\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920958 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-sys-fs\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920964 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3feca2fd-76f4-4d80-9641-209f7a166211-multus-daemon-config\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.921010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.920981 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-cni-netd\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921017 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-slash\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921040 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqtr2\" (UniqueName: \"kubernetes.io/projected/3feca2fd-76f4-4d80-9641-209f7a166211-kube-api-access-mqtr2\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-var-lib-openvswitch\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921100 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3feca2fd-76f4-4d80-9641-209f7a166211-cni-binary-copy\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-run-k8s-cni-cncf-io\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921152 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-kubelet\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921159 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-run-k8s-cni-cncf-io\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921125 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-var-lib-openvswitch\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921177 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-run-ovn-kubernetes\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921220 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-kubelet\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5d24e31e-3394-41c1-a9b3-f9761295236c-iptables-alerter-script\") pod \"iptables-alerter-vk4s9\" (UID: \"5d24e31e-3394-41c1-a9b3-f9761295236c\") " pod="openshift-network-operator/iptables-alerter-vk4s9" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921254 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs\") pod \"network-metrics-daemon-h6fpt\" (UID: \"072c5e3f-6547-42c7-8e8e-c517d7283183\") " pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921265 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-run-ovn-kubernetes\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 
17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921274 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3e9c005-8254-4300-8c36-63018e536c0f-cni-binary-copy\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921281 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-var-lib-kubelet\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921302 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c3fbb6b8-715e-4512-b7ce-584ff3fdf72e-tmp-dir\") pod \"node-resolver-rxccf\" (UID: \"c3fbb6b8-715e-4512-b7ce-584ff3fdf72e\") " pod="openshift-dns/node-resolver-rxccf" Apr 17 17:07:53.921754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-etc-selinux\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921346 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-var-lib-kubelet\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " 
pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921358 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fv2jq\" (UniqueName: \"kubernetes.io/projected/44929b09-362a-4e1a-aa6b-88795d3cd03c-kube-api-access-fv2jq\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3e9c005-8254-4300-8c36-63018e536c0f-system-cni-dir\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921419 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-etc-openvswitch\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921442 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-cni-bin\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921466 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/990242e6-25ef-4749-8d89-b0083d90c418-ovnkube-config\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921489 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/990242e6-25ef-4749-8d89-b0083d90c418-env-overrides\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921504 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3feca2fd-76f4-4d80-9641-209f7a166211-multus-daemon-config\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921517 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rr6v\" (UniqueName: \"kubernetes.io/projected/5d24e31e-3394-41c1-a9b3-f9761295236c-kube-api-access-5rr6v\") pod \"iptables-alerter-vk4s9\" (UID: \"5d24e31e-3394-41c1-a9b3-f9761295236c\") " pod="openshift-network-operator/iptables-alerter-vk4s9" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921521 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-etc-openvswitch\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-node-log\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921550 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3e9c005-8254-4300-8c36-63018e536c0f-system-cni-dir\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-cni-bin\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921574 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-sysconfig\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921604 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c3fbb6b8-715e-4512-b7ce-584ff3fdf72e-tmp-dir\") pod \"node-resolver-rxccf\" (UID: \"c3fbb6b8-715e-4512-b7ce-584ff3fdf72e\") " pod="openshift-dns/node-resolver-rxccf" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-kubernetes\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921617 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-sysconfig\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.922502 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.921638 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921660 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3feca2fd-76f4-4d80-9641-209f7a166211-cni-binary-copy\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-842hx\" (UniqueName: \"kubernetes.io/projected/e2ff083b-5e25-4ad5-9ebe-7d015658c212-kube-api-access-842hx\") pod \"node-ca-vbgg6\" (UID: \"e2ff083b-5e25-4ad5-9ebe-7d015658c212\") " pod="openshift-image-registry/node-ca-vbgg6" Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921684 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-kubernetes\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " 
pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921490 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-etc-selinux\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.921723 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs podName:072c5e3f-6547-42c7-8e8e-c517d7283183 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:54.421674036 +0000 UTC m=+2.089357686 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs") pod "network-metrics-daemon-h6fpt" (UID: "072c5e3f-6547-42c7-8e8e-c517d7283183") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921748 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-node-log\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-device-dir\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.923677 
ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921811 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-run-netns\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921819 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5d24e31e-3394-41c1-a9b3-f9761295236c-iptables-alerter-script\") pod \"iptables-alerter-vk4s9\" (UID: \"5d24e31e-3394-41c1-a9b3-f9761295236c\") " pod="openshift-network-operator/iptables-alerter-vk4s9" Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921829 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-device-dir\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921837 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-run-ovn\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921865 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-cnibin\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.923677 
ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921871 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-run-ovn\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-run-netns\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921889 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-multus-conf-dir\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921913 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c39d4ed7-fddd-4bc2-8cfb-8d0155da370b-konnectivity-ca\") pod \"konnectivity-agent-vf59c\" (UID: \"c39d4ed7-fddd-4bc2-8cfb-8d0155da370b\") " pod="kube-system/konnectivity-agent-vf59c" Apr 17 17:07:53.923677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921916 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-cnibin\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921938 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-multus-conf-dir\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921941 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8sck\" (UniqueName: \"kubernetes.io/projected/e3e9c005-8254-4300-8c36-63018e536c0f-kube-api-access-j8sck\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.921992 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-tuned\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922011 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/990242e6-25ef-4749-8d89-b0083d90c418-ovnkube-config\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922017 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-system-cni-dir\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922043 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-var-lib-cni-bin\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922067 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e2ff083b-5e25-4ad5-9ebe-7d015658c212-serviceca\") pod \"node-ca-vbgg6\" (UID: \"e2ff083b-5e25-4ad5-9ebe-7d015658c212\") " pod="openshift-image-registry/node-ca-vbgg6" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922092 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e3e9c005-8254-4300-8c36-63018e536c0f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922127 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e3e9c005-8254-4300-8c36-63018e536c0f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922144 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vg96\" (UniqueName: \"kubernetes.io/projected/6ae40133-7bde-4034-8619-0eae17ab89a9-kube-api-access-6vg96\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " 
pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922159 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5d24e31e-3394-41c1-a9b3-f9761295236c-host-slash\") pod \"iptables-alerter-vk4s9\" (UID: \"5d24e31e-3394-41c1-a9b3-f9761295236c\") " pod="openshift-network-operator/iptables-alerter-vk4s9" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922184 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-hostroot\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922193 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-var-lib-cni-bin\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922224 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2ff083b-5e25-4ad5-9ebe-7d015658c212-host\") pod \"node-ca-vbgg6\" (UID: \"e2ff083b-5e25-4ad5-9ebe-7d015658c212\") " pod="openshift-image-registry/node-ca-vbgg6" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922258 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5d24e31e-3394-41c1-a9b3-f9761295236c-host-slash\") pod \"iptables-alerter-vk4s9\" (UID: \"5d24e31e-3394-41c1-a9b3-f9761295236c\") " pod="openshift-network-operator/iptables-alerter-vk4s9" Apr 17 
17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922269 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2ff083b-5e25-4ad5-9ebe-7d015658c212-host\") pod \"node-ca-vbgg6\" (UID: \"e2ff083b-5e25-4ad5-9ebe-7d015658c212\") " pod="openshift-image-registry/node-ca-vbgg6" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922281 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/990242e6-25ef-4749-8d89-b0083d90c418-env-overrides\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:07:53.924151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c39d4ed7-fddd-4bc2-8cfb-8d0155da370b-agent-certs\") pod \"konnectivity-agent-vf59c\" (UID: \"c39d4ed7-fddd-4bc2-8cfb-8d0155da370b\") " pod="kube-system/konnectivity-agent-vf59c" Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922301 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-hostroot\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922324 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 
17:07:53.922142 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-system-cni-dir\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9" Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922356 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3e9c005-8254-4300-8c36-63018e536c0f-os-release\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922301 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922371 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44929b09-362a-4e1a-aa6b-88795d3cd03c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-modprobe-d\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922403 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-sysctl-d\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922428 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3e9c005-8254-4300-8c36-63018e536c0f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922429 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3e9c005-8254-4300-8c36-63018e536c0f-os-release\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv" Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922428 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c39d4ed7-fddd-4bc2-8cfb-8d0155da370b-konnectivity-ca\") pod \"konnectivity-agent-vf59c\" (UID: \"c39d4ed7-fddd-4bc2-8cfb-8d0155da370b\") " pod="kube-system/konnectivity-agent-vf59c" Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-modprobe-d\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922517 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-sys\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c"
Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922519 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-sysctl-d\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c"
Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npdj6\" (UniqueName: \"kubernetes.io/projected/072c5e3f-6547-42c7-8e8e-c517d7283183-kube-api-access-npdj6\") pod \"network-metrics-daemon-h6fpt\" (UID: \"072c5e3f-6547-42c7-8e8e-c517d7283183\") " pod="openshift-multus/network-metrics-daemon-h6fpt"
Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922553 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e2ff083b-5e25-4ad5-9ebe-7d015658c212-serviceca\") pod \"node-ca-vbgg6\" (UID: \"e2ff083b-5e25-4ad5-9ebe-7d015658c212\") " pod="openshift-image-registry/node-ca-vbgg6"
Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922566 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-sys\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c"
Apr 17 17:07:53.924650 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922558 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-multus-socket-dir-parent\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922595 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-multus-socket-dir-parent\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922608 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-var-lib-cni-multus\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922636 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-run-multus-certs\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922662 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-run-systemd\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922687 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/990242e6-25ef-4749-8d89-b0083d90c418-ovnkube-script-lib\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922710 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-run-multus-certs\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-lib-modules\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-host\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922753 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e3e9c005-8254-4300-8c36-63018e536c0f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922764 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-multus-cni-dir\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-os-release\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922787 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-run-systemd\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922808 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-lib-modules\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922837 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-host\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922850 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-os-release\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922838 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-multus-cni-dir\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922879 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-host-var-lib-cni-multus\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9"
Apr 17 17:07:53.925275 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922880 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-systemd-units\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922902 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-systemd-units\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-run-openvswitch\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922944 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922968 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/990242e6-25ef-4749-8d89-b0083d90c418-ovn-node-metrics-cert\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922988 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-run\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.922999 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-run-openvswitch\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.923009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ae40133-7bde-4034-8619-0eae17ab89a9-tmp\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.923030 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-etc-kubernetes\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.923038 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ae40133-7bde-4034-8619-0eae17ab89a9-run\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.923088 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/990242e6-25ef-4749-8d89-b0083d90c418-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.923098 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3feca2fd-76f4-4d80-9641-209f7a166211-etc-kubernetes\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.923167 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e3e9c005-8254-4300-8c36-63018e536c0f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.923182 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3e9c005-8254-4300-8c36-63018e536c0f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.923332 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/990242e6-25ef-4749-8d89-b0083d90c418-ovnkube-script-lib\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.925190 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6ae40133-7bde-4034-8619-0eae17ab89a9-etc-tuned\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.925227 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ae40133-7bde-4034-8619-0eae17ab89a9-tmp\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c"
Apr 17 17:07:53.925759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.925356 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/990242e6-25ef-4749-8d89-b0083d90c418-ovn-node-metrics-cert\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:07:53.926252 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.925482 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c39d4ed7-fddd-4bc2-8cfb-8d0155da370b-agent-certs\") pod \"konnectivity-agent-vf59c\" (UID: \"c39d4ed7-fddd-4bc2-8cfb-8d0155da370b\") " pod="kube-system/konnectivity-agent-vf59c"
Apr 17 17:07:53.932667 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.932425 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:07:53.932667 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.932448 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:07:53.932667 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.932460 2572 projected.go:194] Error preparing data for projected volume kube-api-access-ztv9r for pod openshift-network-diagnostics/network-check-target-rqtb2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:07:53.932667 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:53.932508 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r podName:f75def8c-a1a7-475d-934b-23dad26ea8c2 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:54.432492641 +0000 UTC m=+2.100176290 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ztv9r" (UniqueName: "kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r") pod "network-check-target-rqtb2" (UID: "f75def8c-a1a7-475d-934b-23dad26ea8c2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:07:53.933610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.933597 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-842hx\" (UniqueName: \"kubernetes.io/projected/e2ff083b-5e25-4ad5-9ebe-7d015658c212-kube-api-access-842hx\") pod \"node-ca-vbgg6\" (UID: \"e2ff083b-5e25-4ad5-9ebe-7d015658c212\") " pod="openshift-image-registry/node-ca-vbgg6"
Apr 17 17:07:53.934368 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.934317 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv2jq\" (UniqueName: \"kubernetes.io/projected/44929b09-362a-4e1a-aa6b-88795d3cd03c-kube-api-access-fv2jq\") pod \"aws-ebs-csi-driver-node-nbvb9\" (UID: \"44929b09-362a-4e1a-aa6b-88795d3cd03c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9"
Apr 17 17:07:53.934724 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.934677 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npdj6\" (UniqueName: \"kubernetes.io/projected/072c5e3f-6547-42c7-8e8e-c517d7283183-kube-api-access-npdj6\") pod \"network-metrics-daemon-h6fpt\" (UID: \"072c5e3f-6547-42c7-8e8e-c517d7283183\") " pod="openshift-multus/network-metrics-daemon-h6fpt"
Apr 17 17:07:53.934833 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.934726 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rr6v\" (UniqueName: \"kubernetes.io/projected/5d24e31e-3394-41c1-a9b3-f9761295236c-kube-api-access-5rr6v\") pod \"iptables-alerter-vk4s9\" (UID: \"5d24e31e-3394-41c1-a9b3-f9761295236c\") " pod="openshift-network-operator/iptables-alerter-vk4s9"
Apr 17 17:07:53.934833 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.934828 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqtr2\" (UniqueName: \"kubernetes.io/projected/3feca2fd-76f4-4d80-9641-209f7a166211-kube-api-access-mqtr2\") pod \"multus-cjws9\" (UID: \"3feca2fd-76f4-4d80-9641-209f7a166211\") " pod="openshift-multus/multus-cjws9"
Apr 17 17:07:53.935023 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.935002 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xnb2\" (UniqueName: \"kubernetes.io/projected/c3fbb6b8-715e-4512-b7ce-584ff3fdf72e-kube-api-access-7xnb2\") pod \"node-resolver-rxccf\" (UID: \"c3fbb6b8-715e-4512-b7ce-584ff3fdf72e\") " pod="openshift-dns/node-resolver-rxccf"
Apr 17 17:07:53.935276 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.935254 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8sck\" (UniqueName: \"kubernetes.io/projected/e3e9c005-8254-4300-8c36-63018e536c0f-kube-api-access-j8sck\") pod \"multus-additional-cni-plugins-kn4sv\" (UID: \"e3e9c005-8254-4300-8c36-63018e536c0f\") " pod="openshift-multus/multus-additional-cni-plugins-kn4sv"
Apr 17 17:07:53.935890 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.935875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jj7v\" (UniqueName: \"kubernetes.io/projected/990242e6-25ef-4749-8d89-b0083d90c418-kube-api-access-6jj7v\") pod \"ovnkube-node-8ltqc\" (UID: \"990242e6-25ef-4749-8d89-b0083d90c418\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:07:53.936097 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.936082 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vg96\" (UniqueName: \"kubernetes.io/projected/6ae40133-7bde-4034-8619-0eae17ab89a9-kube-api-access-6vg96\") pod \"tuned-s8v7c\" (UID: \"6ae40133-7bde-4034-8619-0eae17ab89a9\") " pod="openshift-cluster-node-tuning-operator/tuned-s8v7c"
Apr 17 17:07:53.948362 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.948330 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" event={"ID":"5d4ee3205b15c3555a6b381bfd25f947","Type":"ContainerStarted","Data":"b59ae3810d436f8c6112221212b261670ab79f5b26a57509159afaf57532898f"}
Apr 17 17:07:53.949255 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:53.949233 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-224.ec2.internal" event={"ID":"03b99da47aa26d733084fbd12fd690dc","Type":"ContainerStarted","Data":"297fc1d8fd4f9fb3653b30f4368c3160fbb72b0a289753f42afbc501848177c5"}
Apr 17 17:07:54.124412 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.124346 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vk4s9"
Apr 17 17:07:54.127850 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.127836 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vf59c"
Apr 17 17:07:54.131121 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:54.131096 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d24e31e_3394_41c1_a9b3_f9761295236c.slice/crio-496cbbdf47ad3b328c4d2d97433fda0ef5213123225dbf445475d519e9a3d000 WatchSource:0}: Error finding container 496cbbdf47ad3b328c4d2d97433fda0ef5213123225dbf445475d519e9a3d000: Status 404 returned error can't find the container with id 496cbbdf47ad3b328c4d2d97433fda0ef5213123225dbf445475d519e9a3d000
Apr 17 17:07:54.135251 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:54.135229 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc39d4ed7_fddd_4bc2_8cfb_8d0155da370b.slice/crio-01bdd52a58a387b42d3703fc604b15dc5511dd9ce25d471572943a2547b35f3f WatchSource:0}: Error finding container 01bdd52a58a387b42d3703fc604b15dc5511dd9ce25d471572943a2547b35f3f: Status 404 returned error can't find the container with id 01bdd52a58a387b42d3703fc604b15dc5511dd9ce25d471572943a2547b35f3f
Apr 17 17:07:54.157823 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.157800 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9"
Apr 17 17:07:54.163272 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:54.163254 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44929b09_362a_4e1a_aa6b_88795d3cd03c.slice/crio-2c86d07388862bcfad3f40b2381f0ec0f38b5119c2c0d4fc37a21d1b78a09989 WatchSource:0}: Error finding container 2c86d07388862bcfad3f40b2381f0ec0f38b5119c2c0d4fc37a21d1b78a09989: Status 404 returned error can't find the container with id 2c86d07388862bcfad3f40b2381f0ec0f38b5119c2c0d4fc37a21d1b78a09989
Apr 17 17:07:54.178708 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.178691 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kn4sv"
Apr 17 17:07:54.184195 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.184175 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cjws9"
Apr 17 17:07:54.184629 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:54.184583 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3e9c005_8254_4300_8c36_63018e536c0f.slice/crio-62ccfdd0645df5da4691db3fd93ab9092a150012b18f6f91c7958d0e387f4273 WatchSource:0}: Error finding container 62ccfdd0645df5da4691db3fd93ab9092a150012b18f6f91c7958d0e387f4273: Status 404 returned error can't find the container with id 62ccfdd0645df5da4691db3fd93ab9092a150012b18f6f91c7958d0e387f4273
Apr 17 17:07:54.189636 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:54.189614 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3feca2fd_76f4_4d80_9641_209f7a166211.slice/crio-0a3ba55ffa21cd0e55811bf9df9a1b655577df63ae98eb9370eac6a462df5d8c WatchSource:0}: Error finding container 0a3ba55ffa21cd0e55811bf9df9a1b655577df63ae98eb9370eac6a462df5d8c: Status 404 returned error can't find the container with id 0a3ba55ffa21cd0e55811bf9df9a1b655577df63ae98eb9370eac6a462df5d8c
Apr 17 17:07:54.191424 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.191372 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:07:54.196559 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.196544 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s8v7c"
Apr 17 17:07:54.198338 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:54.198319 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod990242e6_25ef_4749_8d89_b0083d90c418.slice/crio-2a0bb262be7ecd25de0fbdeb1b85413e56fb3ec25be795cde481ed1000926e15 WatchSource:0}: Error finding container 2a0bb262be7ecd25de0fbdeb1b85413e56fb3ec25be795cde481ed1000926e15: Status 404 returned error can't find the container with id 2a0bb262be7ecd25de0fbdeb1b85413e56fb3ec25be795cde481ed1000926e15
Apr 17 17:07:54.202255 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.202238 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rxccf"
Apr 17 17:07:54.203070 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:54.203054 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ae40133_7bde_4034_8619_0eae17ab89a9.slice/crio-106738a6037d799c47f7d581edb14ad75387a9581beac109b6dff1b3b97c738e WatchSource:0}: Error finding container 106738a6037d799c47f7d581edb14ad75387a9581beac109b6dff1b3b97c738e: Status 404 returned error can't find the container with id 106738a6037d799c47f7d581edb14ad75387a9581beac109b6dff1b3b97c738e
Apr 17 17:07:54.206322 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.206273 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vbgg6"
Apr 17 17:07:54.208451 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:54.208434 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3fbb6b8_715e_4512_b7ce_584ff3fdf72e.slice/crio-24a4a2573ee0defd14149e67a5c577885787b30eeb9399544a3f0568d49d2b73 WatchSource:0}: Error finding container 24a4a2573ee0defd14149e67a5c577885787b30eeb9399544a3f0568d49d2b73: Status 404 returned error can't find the container with id 24a4a2573ee0defd14149e67a5c577885787b30eeb9399544a3f0568d49d2b73
Apr 17 17:07:54.211980 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:07:54.211962 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2ff083b_5e25_4ad5_9ebe_7d015658c212.slice/crio-92ec07adc08f6adf9a239988127b04757f06e1fab05543a271623c47c943529e WatchSource:0}: Error finding container 92ec07adc08f6adf9a239988127b04757f06e1fab05543a271623c47c943529e: Status 404 returned error can't find the container with id 92ec07adc08f6adf9a239988127b04757f06e1fab05543a271623c47c943529e
Apr 17 17:07:54.426340 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.426313 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs\") pod \"network-metrics-daemon-h6fpt\" (UID: \"072c5e3f-6547-42c7-8e8e-c517d7283183\") " pod="openshift-multus/network-metrics-daemon-h6fpt"
Apr 17 17:07:54.426490 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:54.426473 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:07:54.426553 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:54.426538 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs podName:072c5e3f-6547-42c7-8e8e-c517d7283183 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:55.426518711 +0000 UTC m=+3.094202351 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs") pod "network-metrics-daemon-h6fpt" (UID: "072c5e3f-6547-42c7-8e8e-c517d7283183") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:07:54.527624 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.527594 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztv9r\" (UniqueName: \"kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r\") pod \"network-check-target-rqtb2\" (UID: \"f75def8c-a1a7-475d-934b-23dad26ea8c2\") " pod="openshift-network-diagnostics/network-check-target-rqtb2"
Apr 17 17:07:54.527783 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:54.527759 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:07:54.527783 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:54.527778 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:07:54.527878 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:54.527791 2572 projected.go:194] Error preparing data for projected volume kube-api-access-ztv9r for pod openshift-network-diagnostics/network-check-target-rqtb2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:07:54.527878 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:54.527845 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r podName:f75def8c-a1a7-475d-934b-23dad26ea8c2 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:55.527823872 +0000 UTC m=+3.195507521 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ztv9r" (UniqueName: "kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r") pod "network-check-target-rqtb2" (UID: "f75def8c-a1a7-475d-934b-23dad26ea8c2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:07:54.759265 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.757451 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:07:54.852628 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.852588 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:02:53 +0000 UTC" deadline="2027-12-03 02:40:19.033039636 +0000 UTC"
Apr 17 17:07:54.852628 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.852627 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14265h32m24.180417043s"
Apr 17 17:07:54.967857 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.967725 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" event={"ID":"44929b09-362a-4e1a-aa6b-88795d3cd03c","Type":"ContainerStarted","Data":"2c86d07388862bcfad3f40b2381f0ec0f38b5119c2c0d4fc37a21d1b78a09989"}
Apr 17 17:07:54.983536 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.983504 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vf59c" event={"ID":"c39d4ed7-fddd-4bc2-8cfb-8d0155da370b","Type":"ContainerStarted","Data":"01bdd52a58a387b42d3703fc604b15dc5511dd9ce25d471572943a2547b35f3f"}
Apr 17 17:07:54.991126 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:54.990864 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" event={"ID":"6ae40133-7bde-4034-8619-0eae17ab89a9","Type":"ContainerStarted","Data":"106738a6037d799c47f7d581edb14ad75387a9581beac109b6dff1b3b97c738e"}
Apr 17 17:07:55.017112 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:55.016840 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cjws9" event={"ID":"3feca2fd-76f4-4d80-9641-209f7a166211","Type":"ContainerStarted","Data":"0a3ba55ffa21cd0e55811bf9df9a1b655577df63ae98eb9370eac6a462df5d8c"}
Apr 17 17:07:55.028139 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:55.028075 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:07:55.042326 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:55.042221 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vk4s9" event={"ID":"5d24e31e-3394-41c1-a9b3-f9761295236c","Type":"ContainerStarted","Data":"496cbbdf47ad3b328c4d2d97433fda0ef5213123225dbf445475d519e9a3d000"}
Apr 17 17:07:55.047343 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:55.047064 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vbgg6" event={"ID":"e2ff083b-5e25-4ad5-9ebe-7d015658c212","Type":"ContainerStarted","Data":"92ec07adc08f6adf9a239988127b04757f06e1fab05543a271623c47c943529e"}
Apr 17 17:07:55.067647 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:55.067622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rxccf" event={"ID":"c3fbb6b8-715e-4512-b7ce-584ff3fdf72e","Type":"ContainerStarted","Data":"24a4a2573ee0defd14149e67a5c577885787b30eeb9399544a3f0568d49d2b73"}
Apr 17 17:07:55.070929 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:55.070907 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" event={"ID":"990242e6-25ef-4749-8d89-b0083d90c418","Type":"ContainerStarted","Data":"2a0bb262be7ecd25de0fbdeb1b85413e56fb3ec25be795cde481ed1000926e15"}
Apr 17 17:07:55.081321 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:55.080648 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kn4sv" event={"ID":"e3e9c005-8254-4300-8c36-63018e536c0f","Type":"ContainerStarted","Data":"62ccfdd0645df5da4691db3fd93ab9092a150012b18f6f91c7958d0e387f4273"}
Apr 17 17:07:55.435077 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:55.435045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs\") pod \"network-metrics-daemon-h6fpt\" (UID: \"072c5e3f-6547-42c7-8e8e-c517d7283183\") " pod="openshift-multus/network-metrics-daemon-h6fpt"
Apr 17 17:07:55.435270 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:55.435177 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:07:55.435270 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:55.435260 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs podName:072c5e3f-6547-42c7-8e8e-c517d7283183 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:57.435239846 +0000 UTC m=+5.102923489 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs") pod "network-metrics-daemon-h6fpt" (UID: "072c5e3f-6547-42c7-8e8e-c517d7283183") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:07:55.536368 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:55.536305 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztv9r\" (UniqueName: \"kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r\") pod \"network-check-target-rqtb2\" (UID: \"f75def8c-a1a7-475d-934b-23dad26ea8c2\") " pod="openshift-network-diagnostics/network-check-target-rqtb2"
Apr 17 17:07:55.536536 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:55.536466 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:07:55.536536 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:55.536487 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:07:55.536536 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:55.536499 2572 projected.go:194] Error preparing data for projected volume kube-api-access-ztv9r for pod openshift-network-diagnostics/network-check-target-rqtb2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:07:55.536699 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:55.536555 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r podName:f75def8c-a1a7-475d-934b-23dad26ea8c2 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:07:57.536534533 +0000 UTC m=+5.204218168 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ztv9r" (UniqueName: "kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r") pod "network-check-target-rqtb2" (UID: "f75def8c-a1a7-475d-934b-23dad26ea8c2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:55.552377 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:55.552353 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:55.853615 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:55.853495 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:02:53 +0000 UTC" deadline="2027-11-10 23:09:39.71706853 +0000 UTC" Apr 17 17:07:55.853615 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:55.853566 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13734h1m43.863507933s" Apr 17 17:07:55.945913 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:55.945871 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:07:55.946085 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:55.946012 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:07:55.946425 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:55.946402 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:07:55.946529 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:55.946504 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:07:56.528914 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:56.528801 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:57.454027 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:57.453433 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs\") pod \"network-metrics-daemon-h6fpt\" (UID: \"072c5e3f-6547-42c7-8e8e-c517d7283183\") " pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:07:57.454027 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:57.453605 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:57.454027 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:57.453668 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs podName:072c5e3f-6547-42c7-8e8e-c517d7283183 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:08:01.45364982 +0000 UTC m=+9.121333466 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs") pod "network-metrics-daemon-h6fpt" (UID: "072c5e3f-6547-42c7-8e8e-c517d7283183") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:57.554708 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:57.554662 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztv9r\" (UniqueName: \"kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r\") pod \"network-check-target-rqtb2\" (UID: \"f75def8c-a1a7-475d-934b-23dad26ea8c2\") " pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:07:57.554891 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:57.554839 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:07:57.554891 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:57.554859 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:07:57.554891 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:57.554871 2572 projected.go:194] Error preparing data for projected volume kube-api-access-ztv9r for pod openshift-network-diagnostics/network-check-target-rqtb2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:57.555051 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:57.554931 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r 
podName:f75def8c-a1a7-475d-934b-23dad26ea8c2 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:01.554912607 +0000 UTC m=+9.222596241 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ztv9r" (UniqueName: "kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r") pod "network-check-target-rqtb2" (UID: "f75def8c-a1a7-475d-934b-23dad26ea8c2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:57.945806 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:57.945776 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:07:57.945998 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:57.945903 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:07:57.945998 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:57.945961 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:07:57.946101 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:57.946070 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:07:59.945746 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:59.945712 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:07:59.946162 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:59.945854 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:07:59.946253 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:07:59.945712 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:07:59.946312 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:07:59.946280 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:08:01.486891 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:01.486857 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs\") pod \"network-metrics-daemon-h6fpt\" (UID: \"072c5e3f-6547-42c7-8e8e-c517d7283183\") " pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:01.487379 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:01.487017 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:08:01.487379 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:01.487092 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs podName:072c5e3f-6547-42c7-8e8e-c517d7283183 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:09.487072878 +0000 UTC m=+17.154756525 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs") pod "network-metrics-daemon-h6fpt" (UID: "072c5e3f-6547-42c7-8e8e-c517d7283183") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:08:01.587504 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:01.587380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztv9r\" (UniqueName: \"kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r\") pod \"network-check-target-rqtb2\" (UID: \"f75def8c-a1a7-475d-934b-23dad26ea8c2\") " pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:01.587658 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:01.587525 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:08:01.587658 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:01.587549 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:08:01.587658 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:01.587565 2572 projected.go:194] Error preparing data for projected volume kube-api-access-ztv9r for pod openshift-network-diagnostics/network-check-target-rqtb2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:08:01.587658 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:01.587624 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r podName:f75def8c-a1a7-475d-934b-23dad26ea8c2 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:08:09.58760545 +0000 UTC m=+17.255289086 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ztv9r" (UniqueName: "kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r") pod "network-check-target-rqtb2" (UID: "f75def8c-a1a7-475d-934b-23dad26ea8c2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:08:01.946124 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:01.946096 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:01.946124 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:01.946114 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:01.946374 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:01.946229 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:08:01.946374 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:01.946342 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:08:03.945872 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:03.945837 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:03.946310 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:03.945842 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:03.946310 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:03.945957 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:08:03.946310 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:03.946036 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:08:05.945645 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:05.945427 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:05.946063 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:05.945439 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:05.946063 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:05.945741 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:08:05.946063 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:05.945796 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:08:07.945333 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:07.945299 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:07.945887 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:07.945307 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:07.945887 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:07.945477 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:08:07.945887 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:07.945612 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:08:09.547957 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:09.547924 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs\") pod \"network-metrics-daemon-h6fpt\" (UID: \"072c5e3f-6547-42c7-8e8e-c517d7283183\") " pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:09.548466 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:09.548061 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:08:09.548466 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:09.548134 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs podName:072c5e3f-6547-42c7-8e8e-c517d7283183 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:25.548113648 +0000 UTC m=+33.215797295 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs") pod "network-metrics-daemon-h6fpt" (UID: "072c5e3f-6547-42c7-8e8e-c517d7283183") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:08:09.648198 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:09.648167 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztv9r\" (UniqueName: \"kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r\") pod \"network-check-target-rqtb2\" (UID: \"f75def8c-a1a7-475d-934b-23dad26ea8c2\") " pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:09.648366 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:09.648345 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:08:09.648420 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:09.648374 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:08:09.648420 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:09.648389 2572 projected.go:194] Error preparing data for projected volume kube-api-access-ztv9r for pod openshift-network-diagnostics/network-check-target-rqtb2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:08:09.648527 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:09.648452 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r podName:f75def8c-a1a7-475d-934b-23dad26ea8c2 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:08:25.648432817 +0000 UTC m=+33.316116451 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ztv9r" (UniqueName: "kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r") pod "network-check-target-rqtb2" (UID: "f75def8c-a1a7-475d-934b-23dad26ea8c2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:08:09.946091 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:09.946057 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:09.946261 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:09.946057 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:09.946261 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:09.946198 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:08:09.946396 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:09.946299 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:08:11.946000 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:11.945972 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:11.946343 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:11.946057 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:08:11.946343 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:11.945976 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:11.946343 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:11.946123 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:08:13.115519 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.115336 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:08:13.116310 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.116269 2572 generic.go:358] "Generic (PLEG): container finished" podID="990242e6-25ef-4749-8d89-b0083d90c418" containerID="51906383c375482fca8f91ec3ef08bb8f448722d181c9d429bbd2a3a78aef57d" exitCode=1 Apr 17 17:08:13.116407 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.116358 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" event={"ID":"990242e6-25ef-4749-8d89-b0083d90c418","Type":"ContainerStarted","Data":"a8bcc16020acc1f4c6d9626bef0da8bcf6eb0838eb258ba955d51b7c4201fbe1"} Apr 17 17:08:13.116407 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.116388 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" event={"ID":"990242e6-25ef-4749-8d89-b0083d90c418","Type":"ContainerStarted","Data":"15de08f7535c61fda6d13ab7fe0a5e2e3be9d0683815152a90ccb3fa4979b233"} Apr 17 17:08:13.116407 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.116404 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" event={"ID":"990242e6-25ef-4749-8d89-b0083d90c418","Type":"ContainerStarted","Data":"16414c3b4aa72f6a07d4c6ca3497307a6145fe8222fdb1e332905c11c81e7db3"} Apr 17 17:08:13.116501 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.116418 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" event={"ID":"990242e6-25ef-4749-8d89-b0083d90c418","Type":"ContainerStarted","Data":"a05a451a441ea5222402e4a0ff86d60503482448a5748c18219858f200e37d80"} Apr 17 
17:08:13.116501 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.116433 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" event={"ID":"990242e6-25ef-4749-8d89-b0083d90c418","Type":"ContainerDied","Data":"51906383c375482fca8f91ec3ef08bb8f448722d181c9d429bbd2a3a78aef57d"} Apr 17 17:08:13.116501 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.116450 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" event={"ID":"990242e6-25ef-4749-8d89-b0083d90c418","Type":"ContainerStarted","Data":"eb8c14d07558cfc69187fd751255725fa97bb70f57de7077b8afb7cecfb4a896"} Apr 17 17:08:13.117958 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.117862 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" event={"ID":"6ae40133-7bde-4034-8619-0eae17ab89a9","Type":"ContainerStarted","Data":"8e1354d0938d5330988154433276dca91b499e5eec96b95560728eeba11e87da"} Apr 17 17:08:13.119195 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.119175 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cjws9" event={"ID":"3feca2fd-76f4-4d80-9641-209f7a166211","Type":"ContainerStarted","Data":"e792f5121326ee61befd5b58b22b7362933d420bbd80ffcb6f93ceb11c124d00"} Apr 17 17:08:13.120421 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.120401 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-224.ec2.internal" event={"ID":"03b99da47aa26d733084fbd12fd690dc","Type":"ContainerStarted","Data":"23203bb1a9f07105f6aa6e38d41d0973d2be22e158c282f71b75b5278949ffd5"} Apr 17 17:08:13.133772 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.133717 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-s8v7c" podStartSLOduration=2.127194242 podStartE2EDuration="20.133700725s" 
podCreationTimestamp="2026-04-17 17:07:53 +0000 UTC" firstStartedPulling="2026-04-17 17:07:54.204985464 +0000 UTC m=+1.872669097" lastFinishedPulling="2026-04-17 17:08:12.211491945 +0000 UTC m=+19.879175580" observedRunningTime="2026-04-17 17:08:13.132830492 +0000 UTC m=+20.800514146" watchObservedRunningTime="2026-04-17 17:08:13.133700725 +0000 UTC m=+20.801384379" Apr 17 17:08:13.162953 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.162795 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cjws9" podStartSLOduration=1.876199133 podStartE2EDuration="20.162776648s" podCreationTimestamp="2026-04-17 17:07:53 +0000 UTC" firstStartedPulling="2026-04-17 17:07:54.192733967 +0000 UTC m=+1.860417599" lastFinishedPulling="2026-04-17 17:08:12.479311466 +0000 UTC m=+20.146995114" observedRunningTime="2026-04-17 17:08:13.162716655 +0000 UTC m=+20.830400313" watchObservedRunningTime="2026-04-17 17:08:13.162776648 +0000 UTC m=+20.830460305" Apr 17 17:08:13.163141 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.163120 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-224.ec2.internal" podStartSLOduration=20.16311359 podStartE2EDuration="20.16311359s" podCreationTimestamp="2026-04-17 17:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:08:13.145477561 +0000 UTC m=+20.813161215" watchObservedRunningTime="2026-04-17 17:08:13.16311359 +0000 UTC m=+20.830797244" Apr 17 17:08:13.945162 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.945137 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:13.945248 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.945137 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:13.945283 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:13.945272 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:08:13.945395 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:13.945377 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:08:13.961884 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:13.961857 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 17:08:14.123427 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.123358 2572 generic.go:358] "Generic (PLEG): container finished" podID="5d4ee3205b15c3555a6b381bfd25f947" containerID="39fdd7efb4635a317efb18140957febf068cac43cc43b1ccea1d9c2fc2dcdce6" exitCode=0 Apr 17 17:08:14.123776 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.123438 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" event={"ID":"5d4ee3205b15c3555a6b381bfd25f947","Type":"ContainerDied","Data":"39fdd7efb4635a317efb18140957febf068cac43cc43b1ccea1d9c2fc2dcdce6"} Apr 17 17:08:14.123776 ip-10-0-138-224 
kubenswrapper[2572]: I0417 17:08:14.123590 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" Apr 17 17:08:14.124637 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.124615 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vbgg6" event={"ID":"e2ff083b-5e25-4ad5-9ebe-7d015658c212","Type":"ContainerStarted","Data":"2cc00b37826a7b68ce86eabc0acc41f305fa21ca42f459dfe4f936441c7842ff"} Apr 17 17:08:14.125766 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.125746 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rxccf" event={"ID":"c3fbb6b8-715e-4512-b7ce-584ff3fdf72e","Type":"ContainerStarted","Data":"8d772242be12179018f4ebccbeb6f129a6bf8a7e47c67e2547414d6b42b45be8"} Apr 17 17:08:14.127046 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.127025 2572 generic.go:358] "Generic (PLEG): container finished" podID="e3e9c005-8254-4300-8c36-63018e536c0f" containerID="b0c72b90dbc48c37c9d0b38802041e55d659b73d4bbe6797797b5afc71a43fad" exitCode=0 Apr 17 17:08:14.127125 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.127099 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kn4sv" event={"ID":"e3e9c005-8254-4300-8c36-63018e536c0f","Type":"ContainerDied","Data":"b0c72b90dbc48c37c9d0b38802041e55d659b73d4bbe6797797b5afc71a43fad"} Apr 17 17:08:14.128696 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.128672 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" event={"ID":"44929b09-362a-4e1a-aa6b-88795d3cd03c","Type":"ContainerStarted","Data":"acaab72db50c583d4b0a960172d013afdb18af3fefa451304f980d4f8c2b6053"} Apr 17 17:08:14.128760 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.128699 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" event={"ID":"44929b09-362a-4e1a-aa6b-88795d3cd03c","Type":"ContainerStarted","Data":"50519d0ad16f72a120d877591538c80d6a6c7179c72e0845355e446d6f1db819"} Apr 17 17:08:14.129965 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.129938 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vf59c" event={"ID":"c39d4ed7-fddd-4bc2-8cfb-8d0155da370b","Type":"ContainerStarted","Data":"691a8230ef49f27d645d3ea1ba5f8894f39da6d3d32b9154bc138500ae124f04"} Apr 17 17:08:14.131108 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.131087 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vk4s9" event={"ID":"5d24e31e-3394-41c1-a9b3-f9761295236c","Type":"ContainerStarted","Data":"219fddacdeac6e4743f547b85e6d7d3eb0d6559e28c56f8b52b213ec34448b71"} Apr 17 17:08:14.134380 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.134362 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:08:14.135023 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.135006 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal"] Apr 17 17:08:14.138176 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.138119 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vbgg6" podStartSLOduration=3.142467332 podStartE2EDuration="21.138108997s" podCreationTimestamp="2026-04-17 17:07:53 +0000 UTC" firstStartedPulling="2026-04-17 17:07:54.213377888 +0000 UTC m=+1.881061521" lastFinishedPulling="2026-04-17 17:08:12.209019537 +0000 UTC m=+19.876703186" observedRunningTime="2026-04-17 17:08:14.137654336 +0000 UTC m=+21.805337991" watchObservedRunningTime="2026-04-17 17:08:14.138108997 +0000 UTC 
m=+21.805792652" Apr 17 17:08:14.153414 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.153381 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vk4s9" podStartSLOduration=4.075156547 podStartE2EDuration="22.153370879s" podCreationTimestamp="2026-04-17 17:07:52 +0000 UTC" firstStartedPulling="2026-04-17 17:07:54.132975034 +0000 UTC m=+1.800658670" lastFinishedPulling="2026-04-17 17:08:12.211189368 +0000 UTC m=+19.878873002" observedRunningTime="2026-04-17 17:08:14.153137107 +0000 UTC m=+21.820820763" watchObservedRunningTime="2026-04-17 17:08:14.153370879 +0000 UTC m=+21.821054532" Apr 17 17:08:14.171032 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.170999 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vf59c" podStartSLOduration=3.123681892 podStartE2EDuration="21.170989911s" podCreationTimestamp="2026-04-17 17:07:53 +0000 UTC" firstStartedPulling="2026-04-17 17:07:54.136712244 +0000 UTC m=+1.804395876" lastFinishedPulling="2026-04-17 17:08:12.184020247 +0000 UTC m=+19.851703895" observedRunningTime="2026-04-17 17:08:14.170825132 +0000 UTC m=+21.838508786" watchObservedRunningTime="2026-04-17 17:08:14.170989911 +0000 UTC m=+21.838673565" Apr 17 17:08:14.200702 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.200669 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rxccf" podStartSLOduration=3.256092237 podStartE2EDuration="21.200659426s" podCreationTimestamp="2026-04-17 17:07:53 +0000 UTC" firstStartedPulling="2026-04-17 17:07:54.210191522 +0000 UTC m=+1.877875158" lastFinishedPulling="2026-04-17 17:08:12.154758713 +0000 UTC m=+19.822442347" observedRunningTime="2026-04-17 17:08:14.185841368 +0000 UTC m=+21.853525023" watchObservedRunningTime="2026-04-17 17:08:14.200659426 +0000 UTC m=+21.868343079" Apr 17 17:08:14.883875 ip-10-0-138-224 kubenswrapper[2572]: I0417 
17:08:14.883607 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:08:13.961880772Z","UUID":"f983f4ee-5632-4907-a9db-c3f67d22d25a","Handler":null,"Name":"","Endpoint":""} Apr 17 17:08:14.885434 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.885414 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 17:08:14.885534 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:14.885443 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 17:08:15.134970 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:15.134874 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" event={"ID":"44929b09-362a-4e1a-aa6b-88795d3cd03c","Type":"ContainerStarted","Data":"5a20cc37e2fd9997ed3161d763bd9ede9f1b0383c1e2e2bf381ffe831da60605"} Apr 17 17:08:15.136667 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:15.136616 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" event={"ID":"5d4ee3205b15c3555a6b381bfd25f947","Type":"ContainerStarted","Data":"b7a4a8735a9a24a139335c91465bb55e4c9198e276758348f4c3282270d18a90"} Apr 17 17:08:15.139686 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:15.139665 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:08:15.140102 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:15.140073 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" 
event={"ID":"990242e6-25ef-4749-8d89-b0083d90c418","Type":"ContainerStarted","Data":"768ac77478d0962f6fc579754c8523713f56702fedcab7e2b0095dd417175f16"} Apr 17 17:08:15.152040 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:15.151993 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nbvb9" podStartSLOduration=1.5449132589999999 podStartE2EDuration="22.151982469s" podCreationTimestamp="2026-04-17 17:07:53 +0000 UTC" firstStartedPulling="2026-04-17 17:07:54.164580075 +0000 UTC m=+1.832263707" lastFinishedPulling="2026-04-17 17:08:14.771649285 +0000 UTC m=+22.439332917" observedRunningTime="2026-04-17 17:08:15.151609868 +0000 UTC m=+22.819293535" watchObservedRunningTime="2026-04-17 17:08:15.151982469 +0000 UTC m=+22.819666154" Apr 17 17:08:15.169343 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:15.169298 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-224.ec2.internal" podStartSLOduration=1.169282964 podStartE2EDuration="1.169282964s" podCreationTimestamp="2026-04-17 17:08:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:08:15.169145422 +0000 UTC m=+22.836829077" watchObservedRunningTime="2026-04-17 17:08:15.169282964 +0000 UTC m=+22.836966620" Apr 17 17:08:15.775319 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:15.775289 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vf59c" Apr 17 17:08:15.945297 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:15.945267 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:15.945458 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:15.945267 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:15.945458 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:15.945385 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:08:15.945564 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:15.945492 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:08:17.146320 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:17.146166 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:08:17.146827 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:17.146652 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" event={"ID":"990242e6-25ef-4749-8d89-b0083d90c418","Type":"ContainerStarted","Data":"99530d9da530aaea2186878b1e356c400e7115c7139775d6366906ccbc38bfc5"} Apr 17 17:08:17.146948 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:17.146928 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:08:17.147172 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:17.147139 2572 scope.go:117] "RemoveContainer" containerID="51906383c375482fca8f91ec3ef08bb8f448722d181c9d429bbd2a3a78aef57d" Apr 17 17:08:17.160915 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:17.160897 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:08:17.656670 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:17.656592 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vf59c" Apr 17 17:08:17.657313 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:17.657289 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vf59c" Apr 17 17:08:17.945160 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:17.945133 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:17.945353 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:17.945134 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:17.945353 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:17.945271 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:08:17.945353 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:17.945339 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:08:18.156194 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:18.156139 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:08:18.156827 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:18.156754 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" event={"ID":"990242e6-25ef-4749-8d89-b0083d90c418","Type":"ContainerStarted","Data":"c697d28e97c9dcb29e1e58ae0ad2f97ff26a549d9e4cff249eb95c862dc1a902"} Apr 17 17:08:18.156902 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:18.156869 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 17:08:18.157438 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:18.157193 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:08:18.157898 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:18.157849 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vf59c" Apr 17 17:08:18.174704 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:18.174684 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:08:18.186345 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:18.185429 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" podStartSLOduration=7.135613526 podStartE2EDuration="25.185414791s" podCreationTimestamp="2026-04-17 17:07:53 +0000 UTC" firstStartedPulling="2026-04-17 17:07:54.199857941 +0000 UTC m=+1.867541574" lastFinishedPulling="2026-04-17 17:08:12.249659207 +0000 UTC m=+19.917342839" observedRunningTime="2026-04-17 17:08:18.184734465 
+0000 UTC m=+25.852418119" watchObservedRunningTime="2026-04-17 17:08:18.185414791 +0000 UTC m=+25.853098449" Apr 17 17:08:19.158853 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:19.158650 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 17:08:19.194687 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:19.194664 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h6fpt"] Apr 17 17:08:19.194823 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:19.194786 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:19.194914 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:19.194896 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:08:19.197473 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:19.197452 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rqtb2"] Apr 17 17:08:19.197570 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:19.197518 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:19.197614 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:19.197589 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:08:20.127601 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:20.127530 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" Apr 17 17:08:20.161412 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:20.161390 2572 generic.go:358] "Generic (PLEG): container finished" podID="e3e9c005-8254-4300-8c36-63018e536c0f" containerID="3343d8db5f1f753ac0816ebaf923540b3dac6e0674d50ca00cf140ee6fd78397" exitCode=0 Apr 17 17:08:20.161755 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:20.161489 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kn4sv" event={"ID":"e3e9c005-8254-4300-8c36-63018e536c0f","Type":"ContainerDied","Data":"3343d8db5f1f753ac0816ebaf923540b3dac6e0674d50ca00cf140ee6fd78397"} Apr 17 17:08:20.945475 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:20.945440 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:20.945475 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:20.945475 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:20.945686 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:20.945588 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:08:20.945746 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:20.945725 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:08:21.178240 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:21.178172 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc" podUID="990242e6-25ef-4749-8d89-b0083d90c418" containerName="ovnkube-controller" probeResult="failure" output="" Apr 17 17:08:22.166153 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:22.166117 2572 generic.go:358] "Generic (PLEG): container finished" podID="e3e9c005-8254-4300-8c36-63018e536c0f" containerID="d8646783a696a87514c60c105dbc84b8d255995be8079aec8b56795c856c6b46" exitCode=0 Apr 17 17:08:22.166308 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:22.166161 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kn4sv" event={"ID":"e3e9c005-8254-4300-8c36-63018e536c0f","Type":"ContainerDied","Data":"d8646783a696a87514c60c105dbc84b8d255995be8079aec8b56795c856c6b46"} Apr 17 17:08:22.946537 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:22.946387 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:22.946900 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:22.946584 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:08:22.946900 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:22.946453 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:22.946900 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:22.946717 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:08:23.170117 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:23.170041 2572 generic.go:358] "Generic (PLEG): container finished" podID="e3e9c005-8254-4300-8c36-63018e536c0f" containerID="89cd438ac7f7354f56127c851ca2f653499b7eaa0e1230175c9162ec5fc6bd81" exitCode=0 Apr 17 17:08:23.170117 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:23.170085 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kn4sv" event={"ID":"e3e9c005-8254-4300-8c36-63018e536c0f","Type":"ContainerDied","Data":"89cd438ac7f7354f56127c851ca2f653499b7eaa0e1230175c9162ec5fc6bd81"} Apr 17 17:08:24.945999 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:24.945965 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:08:24.946414 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:24.946097 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6fpt" podUID="072c5e3f-6547-42c7-8e8e-c517d7283183" Apr 17 17:08:24.946645 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:24.946616 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2" Apr 17 17:08:24.946770 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:24.946719 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqtb2" podUID="f75def8c-a1a7-475d-934b-23dad26ea8c2" Apr 17 17:08:25.185020 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.184992 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-224.ec2.internal" event="NodeReady" Apr 17 17:08:25.185140 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.185109 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:08:25.232373 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.232310 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-77gh9"] Apr 17 17:08:25.235517 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.235496 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tjgfv"] Apr 17 17:08:25.235680 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.235664 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-77gh9" Apr 17 17:08:25.238194 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.238161 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qdqk7\"" Apr 17 17:08:25.238310 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.238237 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 17:08:25.238437 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.238422 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 17:08:25.239560 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.239543 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tjgfv"
Apr 17 17:08:25.242055 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.242040 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 17:08:25.242161 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.242128 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 17:08:25.242161 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.242135 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j45n5\""
Apr 17 17:08:25.242161 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.242130 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 17:08:25.245624 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.245602 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-77gh9"]
Apr 17 17:08:25.246628 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.246607 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tjgfv"]
Apr 17 17:08:25.378096 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.378074 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg7mm\" (UniqueName: \"kubernetes.io/projected/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-kube-api-access-rg7mm\") pod \"ingress-canary-tjgfv\" (UID: \"8cd72e65-31b5-4a4a-acf8-3800bb1d5898\") " pod="openshift-ingress-canary/ingress-canary-tjgfv"
Apr 17 17:08:25.378247 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.378100 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5fm9\" (UniqueName: \"kubernetes.io/projected/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-kube-api-access-d5fm9\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:25.378247 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.378130 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:25.378247 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.378146 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-tmp-dir\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:25.378247 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.378185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-config-volume\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:25.378394 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.378274 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert\") pod \"ingress-canary-tjgfv\" (UID: \"8cd72e65-31b5-4a4a-acf8-3800bb1d5898\") " pod="openshift-ingress-canary/ingress-canary-tjgfv"
Apr 17 17:08:25.479337 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.479309 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-config-volume\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:25.479466 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.479355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert\") pod \"ingress-canary-tjgfv\" (UID: \"8cd72e65-31b5-4a4a-acf8-3800bb1d5898\") " pod="openshift-ingress-canary/ingress-canary-tjgfv"
Apr 17 17:08:25.479466 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.479412 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rg7mm\" (UniqueName: \"kubernetes.io/projected/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-kube-api-access-rg7mm\") pod \"ingress-canary-tjgfv\" (UID: \"8cd72e65-31b5-4a4a-acf8-3800bb1d5898\") " pod="openshift-ingress-canary/ingress-canary-tjgfv"
Apr 17 17:08:25.479578 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.479484 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5fm9\" (UniqueName: \"kubernetes.io/projected/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-kube-api-access-d5fm9\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:25.479578 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.479524 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:25.479578 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.479540 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-tmp-dir\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:25.479717 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:25.479646 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:08:25.479717 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:25.479694 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:08:25.479794 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:25.479725 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert podName:8cd72e65-31b5-4a4a-acf8-3800bb1d5898 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:25.979704789 +0000 UTC m=+33.647388442 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert") pod "ingress-canary-tjgfv" (UID: "8cd72e65-31b5-4a4a-acf8-3800bb1d5898") : secret "canary-serving-cert" not found
Apr 17 17:08:25.479794 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:25.479763 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls podName:05633129-a6c5-4b2a-9ddc-4a376e6b79c3 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:25.979744406 +0000 UTC m=+33.647428039 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls") pod "dns-default-77gh9" (UID: "05633129-a6c5-4b2a-9ddc-4a376e6b79c3") : secret "dns-default-metrics-tls" not found
Apr 17 17:08:25.479794 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.479762 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-tmp-dir\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:25.479899 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.479872 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-config-volume\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:25.489680 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.489629 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5fm9\" (UniqueName: \"kubernetes.io/projected/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-kube-api-access-d5fm9\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:25.489765 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.489727 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg7mm\" (UniqueName: \"kubernetes.io/projected/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-kube-api-access-rg7mm\") pod \"ingress-canary-tjgfv\" (UID: \"8cd72e65-31b5-4a4a-acf8-3800bb1d5898\") " pod="openshift-ingress-canary/ingress-canary-tjgfv"
Apr 17 17:08:25.580572 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.580551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs\") pod \"network-metrics-daemon-h6fpt\" (UID: \"072c5e3f-6547-42c7-8e8e-c517d7283183\") " pod="openshift-multus/network-metrics-daemon-h6fpt"
Apr 17 17:08:25.580680 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:25.580670 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:08:25.580761 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:25.580749 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs podName:072c5e3f-6547-42c7-8e8e-c517d7283183 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:57.580732007 +0000 UTC m=+65.248415644 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs") pod "network-metrics-daemon-h6fpt" (UID: "072c5e3f-6547-42c7-8e8e-c517d7283183") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:08:25.681355 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.681330 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztv9r\" (UniqueName: \"kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r\") pod \"network-check-target-rqtb2\" (UID: \"f75def8c-a1a7-475d-934b-23dad26ea8c2\") " pod="openshift-network-diagnostics/network-check-target-rqtb2"
Apr 17 17:08:25.681504 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:25.681486 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:08:25.681626 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:25.681512 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:08:25.681626 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:25.681524 2572 projected.go:194] Error preparing data for projected volume kube-api-access-ztv9r for pod openshift-network-diagnostics/network-check-target-rqtb2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:08:25.681626 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:25.681579 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r podName:f75def8c-a1a7-475d-934b-23dad26ea8c2 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:57.681563428 +0000 UTC m=+65.349247064 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ztv9r" (UniqueName: "kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r") pod "network-check-target-rqtb2" (UID: "f75def8c-a1a7-475d-934b-23dad26ea8c2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:08:25.983991 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.983961 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:25.984505 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:25.984002 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert\") pod \"ingress-canary-tjgfv\" (UID: \"8cd72e65-31b5-4a4a-acf8-3800bb1d5898\") " pod="openshift-ingress-canary/ingress-canary-tjgfv"
Apr 17 17:08:25.984505 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:25.984088 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:08:25.984505 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:25.984091 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:08:25.984505 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:25.984135 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert podName:8cd72e65-31b5-4a4a-acf8-3800bb1d5898 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:26.984122393 +0000 UTC m=+34.651806025 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert") pod "ingress-canary-tjgfv" (UID: "8cd72e65-31b5-4a4a-acf8-3800bb1d5898") : secret "canary-serving-cert" not found
Apr 17 17:08:25.984505 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:25.984146 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls podName:05633129-a6c5-4b2a-9ddc-4a376e6b79c3 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:26.984141143 +0000 UTC m=+34.651824775 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls") pod "dns-default-77gh9" (UID: "05633129-a6c5-4b2a-9ddc-4a376e6b79c3") : secret "dns-default-metrics-tls" not found
Apr 17 17:08:26.945727 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:26.945694 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt"
Apr 17 17:08:26.945891 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:26.945871 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2"
Apr 17 17:08:26.950110 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:26.948687 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 17:08:26.950110 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:26.949546 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 17:08:26.950110 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:26.949707 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-s65fp\""
Apr 17 17:08:26.950110 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:26.949792 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 17:08:26.950110 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:26.949971 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mrqrv\""
Apr 17 17:08:26.990234 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:26.990197 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert\") pod \"ingress-canary-tjgfv\" (UID: \"8cd72e65-31b5-4a4a-acf8-3800bb1d5898\") " pod="openshift-ingress-canary/ingress-canary-tjgfv"
Apr 17 17:08:26.990560 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:26.990266 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:26.990560 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:26.990332 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:08:26.990560 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:26.990338 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:08:26.990560 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:26.990389 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert podName:8cd72e65-31b5-4a4a-acf8-3800bb1d5898 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:28.990372232 +0000 UTC m=+36.658055865 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert") pod "ingress-canary-tjgfv" (UID: "8cd72e65-31b5-4a4a-acf8-3800bb1d5898") : secret "canary-serving-cert" not found
Apr 17 17:08:26.990560 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:26.990406 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls podName:05633129-a6c5-4b2a-9ddc-4a376e6b79c3 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:28.990399502 +0000 UTC m=+36.658083133 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls") pod "dns-default-77gh9" (UID: "05633129-a6c5-4b2a-9ddc-4a376e6b79c3") : secret "dns-default-metrics-tls" not found
Apr 17 17:08:29.004226 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:29.004182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert\") pod \"ingress-canary-tjgfv\" (UID: \"8cd72e65-31b5-4a4a-acf8-3800bb1d5898\") " pod="openshift-ingress-canary/ingress-canary-tjgfv"
Apr 17 17:08:29.004766 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:29.004279 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:29.004766 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:29.004314 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:08:29.004766 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:29.004386 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:08:29.004766 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:29.004390 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert podName:8cd72e65-31b5-4a4a-acf8-3800bb1d5898 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:33.004369762 +0000 UTC m=+40.672053395 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert") pod "ingress-canary-tjgfv" (UID: "8cd72e65-31b5-4a4a-acf8-3800bb1d5898") : secret "canary-serving-cert" not found
Apr 17 17:08:29.004766 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:29.004444 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls podName:05633129-a6c5-4b2a-9ddc-4a376e6b79c3 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:33.004429089 +0000 UTC m=+40.672112725 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls") pod "dns-default-77gh9" (UID: "05633129-a6c5-4b2a-9ddc-4a376e6b79c3") : secret "dns-default-metrics-tls" not found
Apr 17 17:08:32.188571 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:32.188544 2572 generic.go:358] "Generic (PLEG): container finished" podID="e3e9c005-8254-4300-8c36-63018e536c0f" containerID="2ff5e652191bedd78e144dd343a5814bd999d68275d9f0ee51045fd7a5f30195" exitCode=0
Apr 17 17:08:32.188983 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:32.188589 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kn4sv" event={"ID":"e3e9c005-8254-4300-8c36-63018e536c0f","Type":"ContainerDied","Data":"2ff5e652191bedd78e144dd343a5814bd999d68275d9f0ee51045fd7a5f30195"}
Apr 17 17:08:33.036410 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:33.036365 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:33.036588 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:33.036422 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert\") pod \"ingress-canary-tjgfv\" (UID: \"8cd72e65-31b5-4a4a-acf8-3800bb1d5898\") " pod="openshift-ingress-canary/ingress-canary-tjgfv"
Apr 17 17:08:33.036588 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:33.036516 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:08:33.036588 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:33.036522 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:08:33.036588 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:33.036577 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert podName:8cd72e65-31b5-4a4a-acf8-3800bb1d5898 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:41.036562164 +0000 UTC m=+48.704245795 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert") pod "ingress-canary-tjgfv" (UID: "8cd72e65-31b5-4a4a-acf8-3800bb1d5898") : secret "canary-serving-cert" not found
Apr 17 17:08:33.036588 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:33.036590 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls podName:05633129-a6c5-4b2a-9ddc-4a376e6b79c3 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:41.036584942 +0000 UTC m=+48.704268575 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls") pod "dns-default-77gh9" (UID: "05633129-a6c5-4b2a-9ddc-4a376e6b79c3") : secret "dns-default-metrics-tls" not found
Apr 17 17:08:33.193005 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:33.192970 2572 generic.go:358] "Generic (PLEG): container finished" podID="e3e9c005-8254-4300-8c36-63018e536c0f" containerID="f7740ed8e68bb3c345e18e42de53a30902d9f3ffcb2e558d18d490bd521ff361" exitCode=0
Apr 17 17:08:33.193348 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:33.193022 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kn4sv" event={"ID":"e3e9c005-8254-4300-8c36-63018e536c0f","Type":"ContainerDied","Data":"f7740ed8e68bb3c345e18e42de53a30902d9f3ffcb2e558d18d490bd521ff361"}
Apr 17 17:08:34.197267 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:34.197235 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kn4sv" event={"ID":"e3e9c005-8254-4300-8c36-63018e536c0f","Type":"ContainerStarted","Data":"134a0605056ee4014586cdf49d55a1d277b799aa023be0aa587a9a8f58bbec1b"}
Apr 17 17:08:34.220101 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:34.220046 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kn4sv" podStartSLOduration=3.868601559 podStartE2EDuration="41.220030138s" podCreationTimestamp="2026-04-17 17:07:53 +0000 UTC" firstStartedPulling="2026-04-17 17:07:54.18730381 +0000 UTC m=+1.854987457" lastFinishedPulling="2026-04-17 17:08:31.53873239 +0000 UTC m=+39.206416036" observedRunningTime="2026-04-17 17:08:34.218708848 +0000 UTC m=+41.886392503" watchObservedRunningTime="2026-04-17 17:08:34.220030138 +0000 UTC m=+41.887713815"
Apr 17 17:08:41.086465 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:41.086433 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:41.086465 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:41.086477 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert\") pod \"ingress-canary-tjgfv\" (UID: \"8cd72e65-31b5-4a4a-acf8-3800bb1d5898\") " pod="openshift-ingress-canary/ingress-canary-tjgfv"
Apr 17 17:08:41.086980 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:41.086590 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:08:41.086980 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:41.086658 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert podName:8cd72e65-31b5-4a4a-acf8-3800bb1d5898 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:57.086641813 +0000 UTC m=+64.754325448 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert") pod "ingress-canary-tjgfv" (UID: "8cd72e65-31b5-4a4a-acf8-3800bb1d5898") : secret "canary-serving-cert" not found
Apr 17 17:08:41.086980 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:41.086590 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:08:41.086980 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:41.086732 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls podName:05633129-a6c5-4b2a-9ddc-4a376e6b79c3 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:57.086720401 +0000 UTC m=+64.754404037 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls") pod "dns-default-77gh9" (UID: "05633129-a6c5-4b2a-9ddc-4a376e6b79c3") : secret "dns-default-metrics-tls" not found
Apr 17 17:08:51.173798 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:51.173763 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ltqc"
Apr 17 17:08:57.091286 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:57.091250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:08:57.091286 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:57.091295 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert\") pod \"ingress-canary-tjgfv\" (UID: \"8cd72e65-31b5-4a4a-acf8-3800bb1d5898\") " pod="openshift-ingress-canary/ingress-canary-tjgfv"
Apr 17 17:08:57.091687 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:57.091385 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:08:57.091687 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:57.091388 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:08:57.091687 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:57.091436 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert podName:8cd72e65-31b5-4a4a-acf8-3800bb1d5898 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:29.091421328 +0000 UTC m=+96.759104959 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert") pod "ingress-canary-tjgfv" (UID: "8cd72e65-31b5-4a4a-acf8-3800bb1d5898") : secret "canary-serving-cert" not found
Apr 17 17:08:57.091687 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:57.091448 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls podName:05633129-a6c5-4b2a-9ddc-4a376e6b79c3 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:29.091442505 +0000 UTC m=+96.759126137 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls") pod "dns-default-77gh9" (UID: "05633129-a6c5-4b2a-9ddc-4a376e6b79c3") : secret "dns-default-metrics-tls" not found
Apr 17 17:08:57.595570 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:57.595526 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs\") pod \"network-metrics-daemon-h6fpt\" (UID: \"072c5e3f-6547-42c7-8e8e-c517d7283183\") " pod="openshift-multus/network-metrics-daemon-h6fpt"
Apr 17 17:08:57.598180 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:57.598161 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 17:08:57.606086 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:57.606072 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 17:08:57.606143 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:08:57.606133 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs podName:072c5e3f-6547-42c7-8e8e-c517d7283183 nodeName:}" failed. No retries permitted until 2026-04-17 17:10:01.606117694 +0000 UTC m=+129.273801326 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs") pod "network-metrics-daemon-h6fpt" (UID: "072c5e3f-6547-42c7-8e8e-c517d7283183") : secret "metrics-daemon-secret" not found
Apr 17 17:08:57.696299 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:57.696273 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztv9r\" (UniqueName: \"kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r\") pod \"network-check-target-rqtb2\" (UID: \"f75def8c-a1a7-475d-934b-23dad26ea8c2\") " pod="openshift-network-diagnostics/network-check-target-rqtb2"
Apr 17 17:08:57.698981 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:57.698965 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 17:08:57.708940 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:57.708921 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 17:08:57.720276 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:57.720255 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztv9r\" (UniqueName: \"kubernetes.io/projected/f75def8c-a1a7-475d-934b-23dad26ea8c2-kube-api-access-ztv9r\") pod \"network-check-target-rqtb2\" (UID: \"f75def8c-a1a7-475d-934b-23dad26ea8c2\") " pod="openshift-network-diagnostics/network-check-target-rqtb2"
Apr 17 17:08:57.866644 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:57.866556 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-s65fp\""
Apr 17 17:08:57.874624 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:57.874602 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqtb2"
Apr 17 17:08:57.994660 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:57.994632 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rqtb2"]
Apr 17 17:08:57.997814 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:08:57.997782 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf75def8c_a1a7_475d_934b_23dad26ea8c2.slice/crio-ebb32158486cb4b00c7414147bd428d7738b7f84e0ad7ef38d65a0d00f22976a WatchSource:0}: Error finding container ebb32158486cb4b00c7414147bd428d7738b7f84e0ad7ef38d65a0d00f22976a: Status 404 returned error can't find the container with id ebb32158486cb4b00c7414147bd428d7738b7f84e0ad7ef38d65a0d00f22976a
Apr 17 17:08:58.244168 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:08:58.244136 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rqtb2" event={"ID":"f75def8c-a1a7-475d-934b-23dad26ea8c2","Type":"ContainerStarted","Data":"ebb32158486cb4b00c7414147bd428d7738b7f84e0ad7ef38d65a0d00f22976a"}
Apr 17 17:09:01.250565 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:01.250534 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rqtb2" event={"ID":"f75def8c-a1a7-475d-934b-23dad26ea8c2","Type":"ContainerStarted","Data":"b8a4106524abd733414c39980b1699731702f1f45615d08ebf46cc5eb3390bf3"}
Apr 17 17:09:01.250913 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:01.250713 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rqtb2"
Apr 17 17:09:01.264871 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:01.264829 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rqtb2" podStartSLOduration=66.587550498 podStartE2EDuration="1m9.264816987s" podCreationTimestamp="2026-04-17 17:07:52 +0000 UTC" firstStartedPulling="2026-04-17 17:08:57.999751878 +0000 UTC m=+65.667435733" lastFinishedPulling="2026-04-17 17:09:00.677018588 +0000 UTC m=+68.344702222" observedRunningTime="2026-04-17 17:09:01.264546531 +0000 UTC m=+68.932230216" watchObservedRunningTime="2026-04-17 17:09:01.264816987 +0000 UTC m=+68.932500642"
Apr 17 17:09:25.263880 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.263758 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49"]
Apr 17 17:09:25.267937 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.267917 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49"
Apr 17 17:09:25.271715 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.271685 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-l8dx9\""
Apr 17 17:09:25.271863 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.271719 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 17 17:09:25.271863 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.271685 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 17:09:25.273100 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.273081 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 17:09:25.273187 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.273128 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 17 17:09:25.276386 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.276352 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49"]
Apr 17 17:09:25.365159 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.365133 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7b86f88f8-qvj8p"]
Apr 17 17:09:25.367835 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.367819 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:25.370430 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.370411 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-gpprd\"" Apr 17 17:09:25.370540 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.370468 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 17:09:25.370540 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.370436 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 17:09:25.370540 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.370500 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9zp8\" (UniqueName: \"kubernetes.io/projected/d7991c94-ae1a-4579-9bde-b10b5d113e64-kube-api-access-m9zp8\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" Apr 17 17:09:25.370540 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.370512 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 17:09:25.370753 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.370550 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d7991c94-ae1a-4579-9bde-b10b5d113e64-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" Apr 17 17:09:25.370753 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.370452 2572 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 17:09:25.370753 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.370488 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 17:09:25.370753 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.370588 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" Apr 17 17:09:25.370753 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.370443 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 17:09:25.379397 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.379374 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7b86f88f8-qvj8p"] Apr 17 17:09:25.471447 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.471420 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-default-certificate\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:25.471572 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.471453 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs\") pod \"router-default-7b86f88f8-qvj8p\" (UID: 
\"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:25.471572 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.471484 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9zp8\" (UniqueName: \"kubernetes.io/projected/d7991c94-ae1a-4579-9bde-b10b5d113e64-kube-api-access-m9zp8\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" Apr 17 17:09:25.471686 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.471589 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d7991c94-ae1a-4579-9bde-b10b5d113e64-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" Apr 17 17:09:25.471686 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.471624 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-stats-auth\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:25.471686 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.471651 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s48tn\" (UniqueName: \"kubernetes.io/projected/394bfa3f-360e-4dbb-ab25-846a90b23983-kube-api-access-s48tn\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:25.471819 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.471774 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:25.471819 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.471809 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" Apr 17 17:09:25.471916 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:25.471903 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:09:25.471989 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:25.471977 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls podName:d7991c94-ae1a-4579-9bde-b10b5d113e64 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:25.971959351 +0000 UTC m=+93.639643010 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-88j49" (UID: "d7991c94-ae1a-4579-9bde-b10b5d113e64") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:09:25.472222 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.472186 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d7991c94-ae1a-4579-9bde-b10b5d113e64-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" Apr 17 17:09:25.478784 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.478761 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9zp8\" (UniqueName: \"kubernetes.io/projected/d7991c94-ae1a-4579-9bde-b10b5d113e64-kube-api-access-m9zp8\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" Apr 17 17:09:25.572064 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.572012 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:25.572064 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.572061 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-stats-auth\") pod \"router-default-7b86f88f8-qvj8p\" (UID: 
\"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:25.572239 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.572078 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s48tn\" (UniqueName: \"kubernetes.io/projected/394bfa3f-360e-4dbb-ab25-846a90b23983-kube-api-access-s48tn\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:25.572239 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.572096 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:25.572239 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:25.572159 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:09:25.572239 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:25.572188 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle podName:394bfa3f-360e-4dbb-ab25-846a90b23983 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:26.072176179 +0000 UTC m=+93.739859811 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle") pod "router-default-7b86f88f8-qvj8p" (UID: "394bfa3f-360e-4dbb-ab25-846a90b23983") : configmap references non-existent config key: service-ca.crt Apr 17 17:09:25.572431 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:25.572249 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs podName:394bfa3f-360e-4dbb-ab25-846a90b23983 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:26.072230396 +0000 UTC m=+93.739914059 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs") pod "router-default-7b86f88f8-qvj8p" (UID: "394bfa3f-360e-4dbb-ab25-846a90b23983") : secret "router-metrics-certs-default" not found Apr 17 17:09:25.572431 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.572314 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-default-certificate\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:25.574344 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.574322 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-stats-auth\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:25.574553 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.574507 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-default-certificate\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:25.580828 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.580801 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s48tn\" (UniqueName: \"kubernetes.io/projected/394bfa3f-360e-4dbb-ab25-846a90b23983-kube-api-access-s48tn\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:25.975652 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:25.975623 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" Apr 17 17:09:25.975776 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:25.975724 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:09:25.975776 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:25.975774 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls podName:d7991c94-ae1a-4579-9bde-b10b5d113e64 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:26.975761045 +0000 UTC m=+94.643444678 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-88j49" (UID: "d7991c94-ae1a-4579-9bde-b10b5d113e64") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:09:26.076726 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:26.076690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:26.076855 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:26.076765 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:26.076855 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:26.076821 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:09:26.076928 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:26.076874 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs podName:394bfa3f-360e-4dbb-ab25-846a90b23983 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:27.076860185 +0000 UTC m=+94.744543817 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs") pod "router-default-7b86f88f8-qvj8p" (UID: "394bfa3f-360e-4dbb-ab25-846a90b23983") : secret "router-metrics-certs-default" not found Apr 17 17:09:26.076928 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:26.076900 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle podName:394bfa3f-360e-4dbb-ab25-846a90b23983 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:27.076887552 +0000 UTC m=+94.744571184 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle") pod "router-default-7b86f88f8-qvj8p" (UID: "394bfa3f-360e-4dbb-ab25-846a90b23983") : configmap references non-existent config key: service-ca.crt Apr 17 17:09:26.982118 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:26.982077 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" Apr 17 17:09:26.982602 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:26.982223 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:09:26.982602 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:26.982288 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls podName:d7991c94-ae1a-4579-9bde-b10b5d113e64 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:09:28.982273932 +0000 UTC m=+96.649957564 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-88j49" (UID: "d7991c94-ae1a-4579-9bde-b10b5d113e64") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:09:27.083255 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.083221 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:27.083379 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.083297 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:27.083379 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:27.083351 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:09:27.083476 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:27.083415 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle podName:394bfa3f-360e-4dbb-ab25-846a90b23983 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:29.083398577 +0000 UTC m=+96.751082209 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle") pod "router-default-7b86f88f8-qvj8p" (UID: "394bfa3f-360e-4dbb-ab25-846a90b23983") : configmap references non-existent config key: service-ca.crt Apr 17 17:09:27.083476 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:27.083432 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs podName:394bfa3f-360e-4dbb-ab25-846a90b23983 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:29.083425583 +0000 UTC m=+96.751109215 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs") pod "router-default-7b86f88f8-qvj8p" (UID: "394bfa3f-360e-4dbb-ab25-846a90b23983") : secret "router-metrics-certs-default" not found Apr 17 17:09:27.221015 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.220987 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn"] Apr 17 17:09:27.223931 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.223909 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn" Apr 17 17:09:27.226529 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.226502 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:09:27.226617 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.226539 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 17:09:27.227461 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.227441 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 17:09:27.227590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.227446 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-wt4lq\"" Apr 17 17:09:27.233004 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.232955 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn"] Apr 17 17:09:27.284667 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.284632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zrctn\" (UID: \"e2318d5b-dbf4-4dc3-bc9a-23b972419390\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn" Apr 17 17:09:27.284774 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.284732 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6flrp\" (UniqueName: 
\"kubernetes.io/projected/e2318d5b-dbf4-4dc3-bc9a-23b972419390-kube-api-access-6flrp\") pod \"cluster-samples-operator-6dc5bdb6b4-zrctn\" (UID: \"e2318d5b-dbf4-4dc3-bc9a-23b972419390\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn" Apr 17 17:09:27.385171 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.385148 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6flrp\" (UniqueName: \"kubernetes.io/projected/e2318d5b-dbf4-4dc3-bc9a-23b972419390-kube-api-access-6flrp\") pod \"cluster-samples-operator-6dc5bdb6b4-zrctn\" (UID: \"e2318d5b-dbf4-4dc3-bc9a-23b972419390\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn" Apr 17 17:09:27.385291 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.385181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zrctn\" (UID: \"e2318d5b-dbf4-4dc3-bc9a-23b972419390\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn" Apr 17 17:09:27.385343 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:27.385291 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:09:27.385343 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:27.385330 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls podName:e2318d5b-dbf4-4dc3-bc9a-23b972419390 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:27.885318663 +0000 UTC m=+95.553002296 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zrctn" (UID: "e2318d5b-dbf4-4dc3-bc9a-23b972419390") : secret "samples-operator-tls" not found
Apr 17 17:09:27.395420 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.395398 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6flrp\" (UniqueName: \"kubernetes.io/projected/e2318d5b-dbf4-4dc3-bc9a-23b972419390-kube-api-access-6flrp\") pod \"cluster-samples-operator-6dc5bdb6b4-zrctn\" (UID: \"e2318d5b-dbf4-4dc3-bc9a-23b972419390\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn"
Apr 17 17:09:27.887892 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:27.887868 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zrctn\" (UID: \"e2318d5b-dbf4-4dc3-bc9a-23b972419390\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn"
Apr 17 17:09:27.888025 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:27.888005 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:09:27.888069 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:27.888058 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls podName:e2318d5b-dbf4-4dc3-bc9a-23b972419390 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:28.888045221 +0000 UTC m=+96.555728858 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zrctn" (UID: "e2318d5b-dbf4-4dc3-bc9a-23b972419390") : secret "samples-operator-tls" not found
Apr 17 17:09:28.222423 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:28.222389 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7gpcz"]
Apr 17 17:09:28.225297 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:28.225280 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7gpcz"
Apr 17 17:09:28.227846 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:28.227825 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:09:28.227950 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:28.227863 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-fp74m\""
Apr 17 17:09:28.228906 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:28.228887 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 17 17:09:28.234103 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:28.234082 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7gpcz"]
Apr 17 17:09:28.292120 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:28.292088 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x6gj\" (UniqueName: \"kubernetes.io/projected/507cdf5e-4dc4-4544-aa29-50b8e78da951-kube-api-access-7x6gj\") pod \"volume-data-source-validator-7c6cbb6c87-7gpcz\" (UID: \"507cdf5e-4dc4-4544-aa29-50b8e78da951\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7gpcz"
Apr 17 17:09:28.393347 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:28.393325 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7x6gj\" (UniqueName: \"kubernetes.io/projected/507cdf5e-4dc4-4544-aa29-50b8e78da951-kube-api-access-7x6gj\") pod \"volume-data-source-validator-7c6cbb6c87-7gpcz\" (UID: \"507cdf5e-4dc4-4544-aa29-50b8e78da951\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7gpcz"
Apr 17 17:09:28.402153 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:28.402134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x6gj\" (UniqueName: \"kubernetes.io/projected/507cdf5e-4dc4-4544-aa29-50b8e78da951-kube-api-access-7x6gj\") pod \"volume-data-source-validator-7c6cbb6c87-7gpcz\" (UID: \"507cdf5e-4dc4-4544-aa29-50b8e78da951\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7gpcz"
Apr 17 17:09:28.535151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:28.535123 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7gpcz"
Apr 17 17:09:28.642048 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:28.642017 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7gpcz"]
Apr 17 17:09:28.645039 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:09:28.645016 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod507cdf5e_4dc4_4544_aa29_50b8e78da951.slice/crio-0b2925711d5508c906706bf674a808afa703294442c21e928ff0e40eee70d424 WatchSource:0}: Error finding container 0b2925711d5508c906706bf674a808afa703294442c21e928ff0e40eee70d424: Status 404 returned error can't find the container with id 0b2925711d5508c906706bf674a808afa703294442c21e928ff0e40eee70d424
Apr 17 17:09:28.896254 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:28.896179 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zrctn\" (UID: \"e2318d5b-dbf4-4dc3-bc9a-23b972419390\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn"
Apr 17 17:09:28.896345 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:28.896326 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:09:28.896393 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:28.896387 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls podName:e2318d5b-dbf4-4dc3-bc9a-23b972419390 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:30.89637121 +0000 UTC m=+98.564054841 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zrctn" (UID: "e2318d5b-dbf4-4dc3-bc9a-23b972419390") : secret "samples-operator-tls" not found
Apr 17 17:09:28.996985 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:28.996961 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49"
Apr 17 17:09:28.997081 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:28.997050 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:09:28.997121 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:28.997091 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls podName:d7991c94-ae1a-4579-9bde-b10b5d113e64 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:32.997079251 +0000 UTC m=+100.664762884 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-88j49" (UID: "d7991c94-ae1a-4579-9bde-b10b5d113e64") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:09:29.097891 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:29.097867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9"
Apr 17 17:09:29.097988 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:29.097901 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p"
Apr 17 17:09:29.097988 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:29.097935 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert\") pod \"ingress-canary-tjgfv\" (UID: \"8cd72e65-31b5-4a4a-acf8-3800bb1d5898\") " pod="openshift-ingress-canary/ingress-canary-tjgfv"
Apr 17 17:09:29.097988 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:29.097972 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p"
Apr 17 17:09:29.098136 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:29.098036 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle podName:394bfa3f-360e-4dbb-ab25-846a90b23983 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:33.098017254 +0000 UTC m=+100.765700889 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle") pod "router-default-7b86f88f8-qvj8p" (UID: "394bfa3f-360e-4dbb-ab25-846a90b23983") : configmap references non-existent config key: service-ca.crt
Apr 17 17:09:29.098136 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:29.098037 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:09:29.098136 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:29.098074 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:09:29.098136 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:29.098044 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:09:29.098136 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:29.098083 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls podName:05633129-a6c5-4b2a-9ddc-4a376e6b79c3 nodeName:}" failed. No retries permitted until 2026-04-17 17:10:33.098074562 +0000 UTC m=+160.765758198 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls") pod "dns-default-77gh9" (UID: "05633129-a6c5-4b2a-9ddc-4a376e6b79c3") : secret "dns-default-metrics-tls" not found
Apr 17 17:09:29.098136 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:29.098127 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert podName:8cd72e65-31b5-4a4a-acf8-3800bb1d5898 nodeName:}" failed. No retries permitted until 2026-04-17 17:10:33.098114138 +0000 UTC m=+160.765797769 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert") pod "ingress-canary-tjgfv" (UID: "8cd72e65-31b5-4a4a-acf8-3800bb1d5898") : secret "canary-serving-cert" not found
Apr 17 17:09:29.098136 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:29.098139 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs podName:394bfa3f-360e-4dbb-ab25-846a90b23983 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:33.098133147 +0000 UTC m=+100.765816835 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs") pod "router-default-7b86f88f8-qvj8p" (UID: "394bfa3f-360e-4dbb-ab25-846a90b23983") : secret "router-metrics-certs-default" not found
Apr 17 17:09:29.305570 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:29.305534 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7gpcz" event={"ID":"507cdf5e-4dc4-4544-aa29-50b8e78da951","Type":"ContainerStarted","Data":"0b2925711d5508c906706bf674a808afa703294442c21e928ff0e40eee70d424"}
Apr 17 17:09:30.912306 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:30.912267 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zrctn\" (UID: \"e2318d5b-dbf4-4dc3-bc9a-23b972419390\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn"
Apr 17 17:09:30.912743 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:30.912403 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:09:30.912743 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:30.912469 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls podName:e2318d5b-dbf4-4dc3-bc9a-23b972419390 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:34.912451752 +0000 UTC m=+102.580135385 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zrctn" (UID: "e2318d5b-dbf4-4dc3-bc9a-23b972419390") : secret "samples-operator-tls" not found
Apr 17 17:09:31.311475 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:31.311443 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7gpcz" event={"ID":"507cdf5e-4dc4-4544-aa29-50b8e78da951","Type":"ContainerStarted","Data":"bf95acd398c674a3a9e222068106d3e616f915d0e42b3ec4a5a0f0cb16e61607"}
Apr 17 17:09:31.326293 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:31.326249 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7gpcz" podStartSLOduration=1.731215117 podStartE2EDuration="3.326235961s" podCreationTimestamp="2026-04-17 17:09:28 +0000 UTC" firstStartedPulling="2026-04-17 17:09:28.647312837 +0000 UTC m=+96.314996469" lastFinishedPulling="2026-04-17 17:09:30.242333681 +0000 UTC m=+97.910017313" observedRunningTime="2026-04-17 17:09:31.325252416 +0000 UTC m=+98.992936070" watchObservedRunningTime="2026-04-17 17:09:31.326235961 +0000 UTC m=+98.993919645"
Apr 17 17:09:31.785652 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:31.785626 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rxccf_c3fbb6b8-715e-4512-b7ce-584ff3fdf72e/dns-node-resolver/0.log"
Apr 17 17:09:32.255678 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:32.255650 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rqtb2"
Apr 17 17:09:32.585607 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:32.585536 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vbgg6_e2ff083b-5e25-4ad5-9ebe-7d015658c212/node-ca/0.log"
Apr 17 17:09:33.028520 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.028490 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49"
Apr 17 17:09:33.028692 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:33.028601 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:09:33.028692 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:33.028669 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls podName:d7991c94-ae1a-4579-9bde-b10b5d113e64 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:41.028653484 +0000 UTC m=+108.696337116 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-88j49" (UID: "d7991c94-ae1a-4579-9bde-b10b5d113e64") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:09:33.129707 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.129665 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p"
Apr 17 17:09:33.129846 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.129735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p"
Apr 17 17:09:33.129846 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:33.129798 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:09:33.129927 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:33.129849 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle podName:394bfa3f-360e-4dbb-ab25-846a90b23983 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:41.129835107 +0000 UTC m=+108.797518739 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle") pod "router-default-7b86f88f8-qvj8p" (UID: "394bfa3f-360e-4dbb-ab25-846a90b23983") : configmap references non-existent config key: service-ca.crt
Apr 17 17:09:33.129927 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:33.129864 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs podName:394bfa3f-360e-4dbb-ab25-846a90b23983 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:41.129856871 +0000 UTC m=+108.797540502 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs") pod "router-default-7b86f88f8-qvj8p" (UID: "394bfa3f-360e-4dbb-ab25-846a90b23983") : secret "router-metrics-certs-default" not found
Apr 17 17:09:33.233637 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.233598 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm"]
Apr 17 17:09:33.235471 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.235456 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm"
Apr 17 17:09:33.237893 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.237873 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 17:09:33.237893 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.237883 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 17:09:33.237893 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.237889 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:09:33.238086 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.237915 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 17:09:33.238830 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.238816 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-bs9kh\""
Apr 17 17:09:33.246187 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.246167 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm"]
Apr 17 17:09:33.331839 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.331776 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03d537b0-8e43-49ac-aaf8-dc6d4576a650-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4wfsm\" (UID: \"03d537b0-8e43-49ac-aaf8-dc6d4576a650\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm"
Apr 17 17:09:33.331839 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.331804 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03d537b0-8e43-49ac-aaf8-dc6d4576a650-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4wfsm\" (UID: \"03d537b0-8e43-49ac-aaf8-dc6d4576a650\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm"
Apr 17 17:09:33.331839 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.331827 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhttd\" (UniqueName: \"kubernetes.io/projected/03d537b0-8e43-49ac-aaf8-dc6d4576a650-kube-api-access-nhttd\") pod \"kube-storage-version-migrator-operator-6769c5d45-4wfsm\" (UID: \"03d537b0-8e43-49ac-aaf8-dc6d4576a650\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm"
Apr 17 17:09:33.432451 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.432410 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03d537b0-8e43-49ac-aaf8-dc6d4576a650-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4wfsm\" (UID: \"03d537b0-8e43-49ac-aaf8-dc6d4576a650\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm"
Apr 17 17:09:33.432451 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.432457 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03d537b0-8e43-49ac-aaf8-dc6d4576a650-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4wfsm\" (UID: \"03d537b0-8e43-49ac-aaf8-dc6d4576a650\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm"
Apr 17 17:09:33.432626 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.432477 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhttd\" (UniqueName: \"kubernetes.io/projected/03d537b0-8e43-49ac-aaf8-dc6d4576a650-kube-api-access-nhttd\") pod \"kube-storage-version-migrator-operator-6769c5d45-4wfsm\" (UID: \"03d537b0-8e43-49ac-aaf8-dc6d4576a650\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm"
Apr 17 17:09:33.433808 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.433782 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03d537b0-8e43-49ac-aaf8-dc6d4576a650-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4wfsm\" (UID: \"03d537b0-8e43-49ac-aaf8-dc6d4576a650\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm"
Apr 17 17:09:33.434602 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.434585 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03d537b0-8e43-49ac-aaf8-dc6d4576a650-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4wfsm\" (UID: \"03d537b0-8e43-49ac-aaf8-dc6d4576a650\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm"
Apr 17 17:09:33.440235 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.440190 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhttd\" (UniqueName: \"kubernetes.io/projected/03d537b0-8e43-49ac-aaf8-dc6d4576a650-kube-api-access-nhttd\") pod \"kube-storage-version-migrator-operator-6769c5d45-4wfsm\" (UID: \"03d537b0-8e43-49ac-aaf8-dc6d4576a650\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm"
Apr 17 17:09:33.544081 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.544057 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm"
Apr 17 17:09:33.654582 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:33.654550 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm"]
Apr 17 17:09:33.657113 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:09:33.657086 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03d537b0_8e43_49ac_aaf8_dc6d4576a650.slice/crio-eb54a1742282a7c7b69fc8480c4083980c1cac243b6ed3c7668e9bc4b63bdc28 WatchSource:0}: Error finding container eb54a1742282a7c7b69fc8480c4083980c1cac243b6ed3c7668e9bc4b63bdc28: Status 404 returned error can't find the container with id eb54a1742282a7c7b69fc8480c4083980c1cac243b6ed3c7668e9bc4b63bdc28
Apr 17 17:09:34.317676 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:34.317628 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm" event={"ID":"03d537b0-8e43-49ac-aaf8-dc6d4576a650","Type":"ContainerStarted","Data":"eb54a1742282a7c7b69fc8480c4083980c1cac243b6ed3c7668e9bc4b63bdc28"}
Apr 17 17:09:34.944059 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:34.944025 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zrctn\" (UID: \"e2318d5b-dbf4-4dc3-bc9a-23b972419390\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn"
Apr 17 17:09:34.944518 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:34.944159 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:09:34.944518 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:34.944271 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls podName:e2318d5b-dbf4-4dc3-bc9a-23b972419390 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:42.944248651 +0000 UTC m=+110.611932294 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zrctn" (UID: "e2318d5b-dbf4-4dc3-bc9a-23b972419390") : secret "samples-operator-tls" not found
Apr 17 17:09:35.290949 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.290871 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554"]
Apr 17 17:09:35.292873 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.292851 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554"
Apr 17 17:09:35.296117 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.296088 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 17 17:09:35.296246 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.296115 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:09:35.296246 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.296128 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 17 17:09:35.297029 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.296998 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-mrxxg\""
Apr 17 17:09:35.297029 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.297003 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 17 17:09:35.301173 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.300764 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554"]
Apr 17 17:09:35.347728 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.347694 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khcrx\" (UniqueName: \"kubernetes.io/projected/7564c7de-af3f-4337-99b0-e2adc3471cf9-kube-api-access-khcrx\") pod \"service-ca-operator-d6fc45fc5-kz554\" (UID: \"7564c7de-af3f-4337-99b0-e2adc3471cf9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554"
Apr 17 17:09:35.347899 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.347748 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7564c7de-af3f-4337-99b0-e2adc3471cf9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-kz554\" (UID: \"7564c7de-af3f-4337-99b0-e2adc3471cf9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554"
Apr 17 17:09:35.347899 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.347790 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7564c7de-af3f-4337-99b0-e2adc3471cf9-config\") pod \"service-ca-operator-d6fc45fc5-kz554\" (UID: \"7564c7de-af3f-4337-99b0-e2adc3471cf9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554"
Apr 17 17:09:35.449005 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.448980 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khcrx\" (UniqueName: \"kubernetes.io/projected/7564c7de-af3f-4337-99b0-e2adc3471cf9-kube-api-access-khcrx\") pod \"service-ca-operator-d6fc45fc5-kz554\" (UID: \"7564c7de-af3f-4337-99b0-e2adc3471cf9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554"
Apr 17 17:09:35.449098 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.449026 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7564c7de-af3f-4337-99b0-e2adc3471cf9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-kz554\" (UID: \"7564c7de-af3f-4337-99b0-e2adc3471cf9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554"
Apr 17 17:09:35.449098 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.449045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7564c7de-af3f-4337-99b0-e2adc3471cf9-config\") pod \"service-ca-operator-d6fc45fc5-kz554\" (UID: \"7564c7de-af3f-4337-99b0-e2adc3471cf9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554"
Apr 17 17:09:35.449676 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.449656 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7564c7de-af3f-4337-99b0-e2adc3471cf9-config\") pod \"service-ca-operator-d6fc45fc5-kz554\" (UID: \"7564c7de-af3f-4337-99b0-e2adc3471cf9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554"
Apr 17 17:09:35.451141 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.451114 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7564c7de-af3f-4337-99b0-e2adc3471cf9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-kz554\" (UID: \"7564c7de-af3f-4337-99b0-e2adc3471cf9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554"
Apr 17 17:09:35.457055 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.457035 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khcrx\" (UniqueName: \"kubernetes.io/projected/7564c7de-af3f-4337-99b0-e2adc3471cf9-kube-api-access-khcrx\") pod \"service-ca-operator-d6fc45fc5-kz554\" (UID: \"7564c7de-af3f-4337-99b0-e2adc3471cf9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554"
Apr 17 17:09:35.604122 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.604041 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554"
Apr 17 17:09:35.645049 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.643134 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mvw99"]
Apr 17 17:09:35.647088 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.647039 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mvw99"
Apr 17 17:09:35.650065 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.650042 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-r57cx\""
Apr 17 17:09:35.654750 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.654229 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mvw99"]
Apr 17 17:09:35.722044 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.722016 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554"]
Apr 17 17:09:35.724181 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:09:35.724155 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7564c7de_af3f_4337_99b0_e2adc3471cf9.slice/crio-3231d8a3a0286f11bf899bfb199651ef2d8d344cdf0ef902267844ffbf05fd52 WatchSource:0}: Error finding container 3231d8a3a0286f11bf899bfb199651ef2d8d344cdf0ef902267844ffbf05fd52: Status 404 returned error can't find the container with id 3231d8a3a0286f11bf899bfb199651ef2d8d344cdf0ef902267844ffbf05fd52
Apr 17 17:09:35.751639 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.751613 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fh8r\" (UniqueName: \"kubernetes.io/projected/f17b9df2-0db9-4e34-a061-e1439528bab6-kube-api-access-5fh8r\") pod \"network-check-source-8894fc9bd-mvw99\" (UID: \"f17b9df2-0db9-4e34-a061-e1439528bab6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mvw99"
Apr 17 17:09:35.853037 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.853008 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fh8r\" (UniqueName: \"kubernetes.io/projected/f17b9df2-0db9-4e34-a061-e1439528bab6-kube-api-access-5fh8r\") pod \"network-check-source-8894fc9bd-mvw99\" (UID: \"f17b9df2-0db9-4e34-a061-e1439528bab6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mvw99"
Apr 17 17:09:35.860916 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.860862 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fh8r\" (UniqueName: \"kubernetes.io/projected/f17b9df2-0db9-4e34-a061-e1439528bab6-kube-api-access-5fh8r\") pod \"network-check-source-8894fc9bd-mvw99\" (UID: \"f17b9df2-0db9-4e34-a061-e1439528bab6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mvw99"
Apr 17 17:09:35.959619 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:35.959585 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mvw99" Apr 17 17:09:36.071753 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:36.071731 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mvw99"] Apr 17 17:09:36.073920 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:09:36.073886 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf17b9df2_0db9_4e34_a061_e1439528bab6.slice/crio-9d39fdcbd87aaf05748e6f477ea5868697a2ff4c7ab719255176d175a9961dde WatchSource:0}: Error finding container 9d39fdcbd87aaf05748e6f477ea5868697a2ff4c7ab719255176d175a9961dde: Status 404 returned error can't find the container with id 9d39fdcbd87aaf05748e6f477ea5868697a2ff4c7ab719255176d175a9961dde Apr 17 17:09:36.322935 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:36.322882 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm" event={"ID":"03d537b0-8e43-49ac-aaf8-dc6d4576a650","Type":"ContainerStarted","Data":"4a122f8f43c8dab2b06dbedd2d55f3872d643b72887d2d64298f12d893e953c4"} Apr 17 17:09:36.324701 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:36.324641 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mvw99" event={"ID":"f17b9df2-0db9-4e34-a061-e1439528bab6","Type":"ContainerStarted","Data":"756306bc9b53d3136c40873c591f447855efa6e800ca7567f5e37dcc2ab10d46"} Apr 17 17:09:36.324701 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:36.324673 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mvw99" event={"ID":"f17b9df2-0db9-4e34-a061-e1439528bab6","Type":"ContainerStarted","Data":"9d39fdcbd87aaf05748e6f477ea5868697a2ff4c7ab719255176d175a9961dde"} Apr 17 
17:09:36.326280 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:36.326255 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554" event={"ID":"7564c7de-af3f-4337-99b0-e2adc3471cf9","Type":"ContainerStarted","Data":"3231d8a3a0286f11bf899bfb199651ef2d8d344cdf0ef902267844ffbf05fd52"} Apr 17 17:09:36.338694 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:36.338647 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm" podStartSLOduration=1.5808884079999999 podStartE2EDuration="3.338632345s" podCreationTimestamp="2026-04-17 17:09:33 +0000 UTC" firstStartedPulling="2026-04-17 17:09:33.658938824 +0000 UTC m=+101.326622456" lastFinishedPulling="2026-04-17 17:09:35.416682743 +0000 UTC m=+103.084366393" observedRunningTime="2026-04-17 17:09:36.337544198 +0000 UTC m=+104.005227845" watchObservedRunningTime="2026-04-17 17:09:36.338632345 +0000 UTC m=+104.006316000" Apr 17 17:09:38.331932 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:38.331897 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554" event={"ID":"7564c7de-af3f-4337-99b0-e2adc3471cf9","Type":"ContainerStarted","Data":"df02dfe81b9b0bcbd17eda46484da07381cb43e19076b47c6a601cc7006e92e5"} Apr 17 17:09:38.346640 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:38.346596 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554" podStartSLOduration=1.591296631 podStartE2EDuration="3.346572155s" podCreationTimestamp="2026-04-17 17:09:35 +0000 UTC" firstStartedPulling="2026-04-17 17:09:35.725973284 +0000 UTC m=+103.393656916" lastFinishedPulling="2026-04-17 17:09:37.481248804 +0000 UTC m=+105.148932440" observedRunningTime="2026-04-17 17:09:38.345479749 +0000 UTC 
m=+106.013163402" watchObservedRunningTime="2026-04-17 17:09:38.346572155 +0000 UTC m=+106.014255808" Apr 17 17:09:38.347027 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:38.346998 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mvw99" podStartSLOduration=3.346989673 podStartE2EDuration="3.346989673s" podCreationTimestamp="2026-04-17 17:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:09:36.352339404 +0000 UTC m=+104.020023059" watchObservedRunningTime="2026-04-17 17:09:38.346989673 +0000 UTC m=+106.014673373" Apr 17 17:09:41.094863 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:41.094825 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" Apr 17 17:09:41.095322 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:41.094985 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:09:41.095322 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:41.095055 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls podName:d7991c94-ae1a-4579-9bde-b10b5d113e64 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:57.095038219 +0000 UTC m=+124.762721851 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-88j49" (UID: "d7991c94-ae1a-4579-9bde-b10b5d113e64") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:09:41.195401 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:41.195365 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:41.195555 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:41.195467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:41.195555 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:41.195532 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle podName:394bfa3f-360e-4dbb-ab25-846a90b23983 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:57.195512968 +0000 UTC m=+124.863196600 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle") pod "router-default-7b86f88f8-qvj8p" (UID: "394bfa3f-360e-4dbb-ab25-846a90b23983") : configmap references non-existent config key: service-ca.crt Apr 17 17:09:41.195635 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:41.195559 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:09:41.195635 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:41.195624 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs podName:394bfa3f-360e-4dbb-ab25-846a90b23983 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:57.195611915 +0000 UTC m=+124.863295560 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs") pod "router-default-7b86f88f8-qvj8p" (UID: "394bfa3f-360e-4dbb-ab25-846a90b23983") : secret "router-metrics-certs-default" not found Apr 17 17:09:43.009820 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:43.009776 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zrctn\" (UID: \"e2318d5b-dbf4-4dc3-bc9a-23b972419390\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn" Apr 17 17:09:43.010235 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:43.009890 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:09:43.010235 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:09:43.009946 2572 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls podName:e2318d5b-dbf4-4dc3-bc9a-23b972419390 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:59.009933259 +0000 UTC m=+126.677616891 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zrctn" (UID: "e2318d5b-dbf4-4dc3-bc9a-23b972419390") : secret "samples-operator-tls" not found Apr 17 17:09:57.119394 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:57.119350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" Apr 17 17:09:57.121827 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:57.121804 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7991c94-ae1a-4579-9bde-b10b5d113e64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-88j49\" (UID: \"d7991c94-ae1a-4579-9bde-b10b5d113e64\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" Apr 17 17:09:57.220100 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:57.220069 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 
17:09:57.220257 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:57.220120 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:57.220639 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:57.220620 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/394bfa3f-360e-4dbb-ab25-846a90b23983-service-ca-bundle\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:57.222780 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:57.222761 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394bfa3f-360e-4dbb-ab25-846a90b23983-metrics-certs\") pod \"router-default-7b86f88f8-qvj8p\" (UID: \"394bfa3f-360e-4dbb-ab25-846a90b23983\") " pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:57.380381 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:57.380320 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-l8dx9\"" Apr 17 17:09:57.388291 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:57.388276 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" Apr 17 17:09:57.478965 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:57.478802 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-gpprd\"" Apr 17 17:09:57.487106 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:57.487031 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:57.522910 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:57.522882 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49"] Apr 17 17:09:57.527147 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:09:57.527116 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7991c94_ae1a_4579_9bde_b10b5d113e64.slice/crio-d1f37b77639217e4c01f36f5700dc5a3ffcb9e5f77f4473cffa01d5aca3ee09c WatchSource:0}: Error finding container d1f37b77639217e4c01f36f5700dc5a3ffcb9e5f77f4473cffa01d5aca3ee09c: Status 404 returned error can't find the container with id d1f37b77639217e4c01f36f5700dc5a3ffcb9e5f77f4473cffa01d5aca3ee09c Apr 17 17:09:57.602189 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:57.602085 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7b86f88f8-qvj8p"] Apr 17 17:09:57.604384 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:09:57.604362 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod394bfa3f_360e_4dbb_ab25_846a90b23983.slice/crio-c1f44077313f98925f73d373177c105187b4b4836594d5efcc88f806a960c107 WatchSource:0}: Error finding container c1f44077313f98925f73d373177c105187b4b4836594d5efcc88f806a960c107: Status 404 returned error can't find the container with id 
c1f44077313f98925f73d373177c105187b4b4836594d5efcc88f806a960c107 Apr 17 17:09:58.378181 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:58.378132 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" event={"ID":"d7991c94-ae1a-4579-9bde-b10b5d113e64","Type":"ContainerStarted","Data":"d1f37b77639217e4c01f36f5700dc5a3ffcb9e5f77f4473cffa01d5aca3ee09c"} Apr 17 17:09:58.379350 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:58.379314 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b86f88f8-qvj8p" event={"ID":"394bfa3f-360e-4dbb-ab25-846a90b23983","Type":"ContainerStarted","Data":"45fdb8f3e5112170328536ce1142c611e792d71db87ccb9f43531448c62dca1a"} Apr 17 17:09:58.379350 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:58.379345 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b86f88f8-qvj8p" event={"ID":"394bfa3f-360e-4dbb-ab25-846a90b23983","Type":"ContainerStarted","Data":"c1f44077313f98925f73d373177c105187b4b4836594d5efcc88f806a960c107"} Apr 17 17:09:58.398257 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:58.398197 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7b86f88f8-qvj8p" podStartSLOduration=33.398185841 podStartE2EDuration="33.398185841s" podCreationTimestamp="2026-04-17 17:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:09:58.397074369 +0000 UTC m=+126.064758023" watchObservedRunningTime="2026-04-17 17:09:58.398185841 +0000 UTC m=+126.065869473" Apr 17 17:09:58.487586 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:58.487558 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:58.489857 ip-10-0-138-224 kubenswrapper[2572]: I0417 
17:09:58.489837 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:59.034061 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:59.034026 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zrctn\" (UID: \"e2318d5b-dbf4-4dc3-bc9a-23b972419390\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn" Apr 17 17:09:59.036857 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:59.036828 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2318d5b-dbf4-4dc3-bc9a-23b972419390-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zrctn\" (UID: \"e2318d5b-dbf4-4dc3-bc9a-23b972419390\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn" Apr 17 17:09:59.335475 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:59.335394 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-wt4lq\"" Apr 17 17:09:59.343603 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:59.343579 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn" Apr 17 17:09:59.381724 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:59.381694 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:59.383107 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:59.383085 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7b86f88f8-qvj8p" Apr 17 17:09:59.532687 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:09:59.532660 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn"] Apr 17 17:10:00.384028 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:00.383995 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn" event={"ID":"e2318d5b-dbf4-4dc3-bc9a-23b972419390","Type":"ContainerStarted","Data":"f236690cdb1f52ccecc4ddc5a83f430514889ebb07aedd5d1f9113d7b55df5a4"} Apr 17 17:10:00.385216 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:00.385187 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" event={"ID":"d7991c94-ae1a-4579-9bde-b10b5d113e64","Type":"ContainerStarted","Data":"47c5925d495d01715e7202586bce73aae10d791bc7668899f210febf9655fe26"} Apr 17 17:10:00.403590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:00.403546 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-88j49" podStartSLOduration=33.480360824 podStartE2EDuration="35.403532824s" podCreationTimestamp="2026-04-17 17:09:25 +0000 UTC" firstStartedPulling="2026-04-17 17:09:57.529263026 +0000 UTC m=+125.196946658" lastFinishedPulling="2026-04-17 17:09:59.452435022 +0000 UTC m=+127.120118658" 
observedRunningTime="2026-04-17 17:10:00.401781014 +0000 UTC m=+128.069464669" watchObservedRunningTime="2026-04-17 17:10:00.403532824 +0000 UTC m=+128.071216472" Apr 17 17:10:01.654689 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:01.654654 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs\") pod \"network-metrics-daemon-h6fpt\" (UID: \"072c5e3f-6547-42c7-8e8e-c517d7283183\") " pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:10:01.657247 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:01.657203 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/072c5e3f-6547-42c7-8e8e-c517d7283183-metrics-certs\") pod \"network-metrics-daemon-h6fpt\" (UID: \"072c5e3f-6547-42c7-8e8e-c517d7283183\") " pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:10:01.760570 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:01.760544 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mrqrv\"" Apr 17 17:10:01.769474 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:01.769454 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6fpt" Apr 17 17:10:01.880381 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:01.880354 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h6fpt"] Apr 17 17:10:01.884035 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:10:01.884012 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod072c5e3f_6547_42c7_8e8e_c517d7283183.slice/crio-d8678d9e72ef4867012d69235ba34713d8ee7c0e212373b15cb640de91ad9592 WatchSource:0}: Error finding container d8678d9e72ef4867012d69235ba34713d8ee7c0e212373b15cb640de91ad9592: Status 404 returned error can't find the container with id d8678d9e72ef4867012d69235ba34713d8ee7c0e212373b15cb640de91ad9592 Apr 17 17:10:02.391823 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:02.391792 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h6fpt" event={"ID":"072c5e3f-6547-42c7-8e8e-c517d7283183","Type":"ContainerStarted","Data":"d8678d9e72ef4867012d69235ba34713d8ee7c0e212373b15cb640de91ad9592"} Apr 17 17:10:02.393465 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:02.393432 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn" event={"ID":"e2318d5b-dbf4-4dc3-bc9a-23b972419390","Type":"ContainerStarted","Data":"dfbd1bb58e89ea1a4fe2b760c2a44f4be11b08a635226a354cabf8e3d59b97aa"} Apr 17 17:10:02.393465 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:02.393465 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn" event={"ID":"e2318d5b-dbf4-4dc3-bc9a-23b972419390","Type":"ContainerStarted","Data":"0a3c2ab82cf5f8655f74744655d35331b1da0b9e5207dc73136d7426a2d3db29"} Apr 17 17:10:02.412732 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:02.409922 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zrctn" podStartSLOduration=33.565890733 podStartE2EDuration="35.409904726s" podCreationTimestamp="2026-04-17 17:09:27 +0000 UTC" firstStartedPulling="2026-04-17 17:09:59.579575149 +0000 UTC m=+127.247258781" lastFinishedPulling="2026-04-17 17:10:01.423589142 +0000 UTC m=+129.091272774" observedRunningTime="2026-04-17 17:10:02.40832267 +0000 UTC m=+130.076006324" watchObservedRunningTime="2026-04-17 17:10:02.409904726 +0000 UTC m=+130.077588380" Apr 17 17:10:03.397174 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:03.397096 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h6fpt" event={"ID":"072c5e3f-6547-42c7-8e8e-c517d7283183","Type":"ContainerStarted","Data":"1e6efbda80cce9f87b78fd5b8c48e916d2137067f7e0bfcb9e56aca90c51e154"} Apr 17 17:10:03.397174 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:03.397138 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h6fpt" event={"ID":"072c5e3f-6547-42c7-8e8e-c517d7283183","Type":"ContainerStarted","Data":"790d0a197802e3ca7a23232690f7838261a374f1c796659a1fc808f449b59779"} Apr 17 17:10:03.412389 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:03.412337 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-h6fpt" podStartSLOduration=129.202536852 podStartE2EDuration="2m10.412319113s" podCreationTimestamp="2026-04-17 17:07:53 +0000 UTC" firstStartedPulling="2026-04-17 17:10:01.885779121 +0000 UTC m=+129.553462753" lastFinishedPulling="2026-04-17 17:10:03.095561381 +0000 UTC m=+130.763245014" observedRunningTime="2026-04-17 17:10:03.411390376 +0000 UTC m=+131.079074029" watchObservedRunningTime="2026-04-17 17:10:03.412319113 +0000 UTC m=+131.080002769" Apr 17 17:10:06.325960 ip-10-0-138-224 kubenswrapper[2572]: I0417 
17:10:06.325922 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qm5w4"]
Apr 17 17:10:06.327996 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.327974 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qm5w4"
Apr 17 17:10:06.331806 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.331771 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-fsqzv\""
Apr 17 17:10:06.331806 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.331796 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 17:10:06.331957 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.331873 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 17:10:06.344304 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.344281 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qm5w4"]
Apr 17 17:10:06.387298 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.387270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5b54101b-e6bf-474e-95bf-7a3894ef0486-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qm5w4\" (UID: \"5b54101b-e6bf-474e-95bf-7a3894ef0486\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qm5w4"
Apr 17 17:10:06.387395 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.387367 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5b54101b-e6bf-474e-95bf-7a3894ef0486-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qm5w4\" (UID: \"5b54101b-e6bf-474e-95bf-7a3894ef0486\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qm5w4"
Apr 17 17:10:06.428579 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.428556 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wsbz9"]
Apr 17 17:10:06.430412 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.430398 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wsbz9"
Apr 17 17:10:06.433617 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.433594 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-9bfdb\""
Apr 17 17:10:06.434045 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.434027 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 17:10:06.434256 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.434235 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-546867ff6c-k7mp9"]
Apr 17 17:10:06.437708 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.437688 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.440302 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.440281 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 17:10:06.440565 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.440535 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-wb5f4\""
Apr 17 17:10:06.440672 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.440591 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 17:10:06.440672 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.440629 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 17:10:06.440886 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.440867 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 17:10:06.440992 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.440927 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 17:10:06.441388 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.441373 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 17:10:06.441471 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.441419 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 17:10:06.444281 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.444262 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wsbz9"]
Apr 17 17:10:06.455100 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.455077 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-tghn4"]
Apr 17 17:10:06.459044 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.459026 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-546867ff6c-k7mp9"]
Apr 17 17:10:06.459243 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.459226 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.463924 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.463903 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 17:10:06.464467 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.464452 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 17:10:06.465531 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.465513 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 17:10:06.465745 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.465730 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hg6xj\""
Apr 17 17:10:06.465795 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.465767 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 17:10:06.482771 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.482752 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tghn4"]
Apr 17 17:10:06.488341 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.488321 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-console-config\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.488420 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.488348 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/078e4a62-1e16-43bc-8318-2e2a520db01f-console-oauth-config\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.488420 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.488366 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/81b4269c-846c-46c3-9c36-0aa0083de609-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.488420 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.488389 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7rwf\" (UniqueName: \"kubernetes.io/projected/81b4269c-846c-46c3-9c36-0aa0083de609-kube-api-access-v7rwf\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.488535 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.488505 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5b54101b-e6bf-474e-95bf-7a3894ef0486-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qm5w4\" (UID: \"5b54101b-e6bf-474e-95bf-7a3894ef0486\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qm5w4"
Apr 17 17:10:06.488574 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.488546 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-oauth-serving-cert\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.488635 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.488576 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/cd8e3ca4-bd8f-42be-8464-9caa8f36f300-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-wsbz9\" (UID: \"cd8e3ca4-bd8f-42be-8464-9caa8f36f300\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wsbz9"
Apr 17 17:10:06.488635 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.488604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/078e4a62-1e16-43bc-8318-2e2a520db01f-console-serving-cert\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.488703 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.488675 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5b54101b-e6bf-474e-95bf-7a3894ef0486-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qm5w4\" (UID: \"5b54101b-e6bf-474e-95bf-7a3894ef0486\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qm5w4"
Apr 17 17:10:06.488738 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.488702 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/81b4269c-846c-46c3-9c36-0aa0083de609-crio-socket\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.488782 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.488748 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z26dr\" (UniqueName: \"kubernetes.io/projected/078e4a62-1e16-43bc-8318-2e2a520db01f-kube-api-access-z26dr\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.488834 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.488782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/81b4269c-846c-46c3-9c36-0aa0083de609-data-volume\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.488834 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.488813 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/81b4269c-846c-46c3-9c36-0aa0083de609-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.488937 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.488883 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-service-ca\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.489312 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.489295 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5b54101b-e6bf-474e-95bf-7a3894ef0486-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qm5w4\" (UID: \"5b54101b-e6bf-474e-95bf-7a3894ef0486\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qm5w4"
Apr 17 17:10:06.490800 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.490782 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5b54101b-e6bf-474e-95bf-7a3894ef0486-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qm5w4\" (UID: \"5b54101b-e6bf-474e-95bf-7a3894ef0486\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qm5w4"
Apr 17 17:10:06.530946 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.530928 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-555cf688b8-smdbn"]
Apr 17 17:10:06.532871 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.532856 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.535827 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.535812 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 17:10:06.535905 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.535814 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 17:10:06.536337 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.536322 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-b4xpj\""
Apr 17 17:10:06.536663 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.536643 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 17:10:06.541149 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.541128 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 17:10:06.547251 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.547231 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-555cf688b8-smdbn"]
Apr 17 17:10:06.589832 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.589781 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-oauth-serving-cert\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.589832 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.589809 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjxll\" (UniqueName: \"kubernetes.io/projected/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-kube-api-access-zjxll\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.589832 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.589829 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/cd8e3ca4-bd8f-42be-8464-9caa8f36f300-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-wsbz9\" (UID: \"cd8e3ca4-bd8f-42be-8464-9caa8f36f300\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wsbz9"
Apr 17 17:10:06.589971 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.589846 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-ca-trust-extracted\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.589971 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.589951 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-trusted-ca\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.590036 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.589978 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/078e4a62-1e16-43bc-8318-2e2a520db01f-console-serving-cert\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.590036 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590010 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/81b4269c-846c-46c3-9c36-0aa0083de609-crio-socket\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.590094 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590052 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z26dr\" (UniqueName: \"kubernetes.io/projected/078e4a62-1e16-43bc-8318-2e2a520db01f-kube-api-access-z26dr\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.590133 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590094 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-registry-certificates\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.590193 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/81b4269c-846c-46c3-9c36-0aa0083de609-data-volume\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.590274 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590175 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/81b4269c-846c-46c3-9c36-0aa0083de609-crio-socket\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.590274 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590229 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-registry-tls\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.590353 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590283 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/81b4269c-846c-46c3-9c36-0aa0083de609-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.590353 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590321 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-bound-sa-token\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.590438 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590363 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-service-ca\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.590438 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590413 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-console-config\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.590534 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590461 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-image-registry-private-configuration\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.590534 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590493 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/078e4a62-1e16-43bc-8318-2e2a520db01f-console-oauth-config\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.590534 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590490 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/81b4269c-846c-46c3-9c36-0aa0083de609-data-volume\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.590677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590605 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-oauth-serving-cert\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.590677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/81b4269c-846c-46c3-9c36-0aa0083de609-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.590677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590666 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7rwf\" (UniqueName: \"kubernetes.io/projected/81b4269c-846c-46c3-9c36-0aa0083de609-kube-api-access-v7rwf\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.590821 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590699 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-installation-pull-secrets\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.590914 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.590885 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/81b4269c-846c-46c3-9c36-0aa0083de609-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.591026 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.591004 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-console-config\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.591126 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.591103 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-service-ca\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.592634 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.592603 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/cd8e3ca4-bd8f-42be-8464-9caa8f36f300-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-wsbz9\" (UID: \"cd8e3ca4-bd8f-42be-8464-9caa8f36f300\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wsbz9"
Apr 17 17:10:06.592737 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.592663 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/078e4a62-1e16-43bc-8318-2e2a520db01f-console-serving-cert\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.592799 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.592757 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/078e4a62-1e16-43bc-8318-2e2a520db01f-console-oauth-config\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.592918 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.592902 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/81b4269c-846c-46c3-9c36-0aa0083de609-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.600909 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.600888 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z26dr\" (UniqueName: \"kubernetes.io/projected/078e4a62-1e16-43bc-8318-2e2a520db01f-kube-api-access-z26dr\") pod \"console-546867ff6c-k7mp9\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.601136 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.601120 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7rwf\" (UniqueName: \"kubernetes.io/projected/81b4269c-846c-46c3-9c36-0aa0083de609-kube-api-access-v7rwf\") pod \"insights-runtime-extractor-tghn4\" (UID: \"81b4269c-846c-46c3-9c36-0aa0083de609\") " pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.637281 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.637261 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qm5w4"
Apr 17 17:10:06.692085 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.692052 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-registry-certificates\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.692196 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.692102 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-registry-tls\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.692196 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.692168 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-bound-sa-token\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.692367 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.692335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-image-registry-private-configuration\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.692423 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.692395 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-installation-pull-secrets\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.692472 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.692440 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjxll\" (UniqueName: \"kubernetes.io/projected/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-kube-api-access-zjxll\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.692522 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.692469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-ca-trust-extracted\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.692522 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.692495 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-trusted-ca\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.693088 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.693060 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-ca-trust-extracted\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.693191 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.693147 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-registry-certificates\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.693809 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.693765 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-trusted-ca\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.695479 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.695450 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-installation-pull-secrets\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.695642 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.695618 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-registry-tls\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.696934 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.696134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-image-registry-private-configuration\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.700258 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.700239 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-bound-sa-token\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.701656 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.701625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjxll\" (UniqueName: \"kubernetes.io/projected/9cad22a0-bc57-4643-ab1b-ce70d19c1c47-kube-api-access-zjxll\") pod \"image-registry-555cf688b8-smdbn\" (UID: \"9cad22a0-bc57-4643-ab1b-ce70d19c1c47\") " pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.746400 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.746375 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wsbz9"
Apr 17 17:10:06.750885 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.749556 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qm5w4"]
Apr 17 17:10:06.752135 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.752092 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-546867ff6c-k7mp9"
Apr 17 17:10:06.754228 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:10:06.754187 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b54101b_e6bf_474e_95bf_7a3894ef0486.slice/crio-91fc339c2f8d4bffe74a99cad61143605ef56a0c1cc13ebe8d475e31e856f022 WatchSource:0}: Error finding container 91fc339c2f8d4bffe74a99cad61143605ef56a0c1cc13ebe8d475e31e856f022: Status 404 returned error can't find the container with id 91fc339c2f8d4bffe74a99cad61143605ef56a0c1cc13ebe8d475e31e856f022
Apr 17 17:10:06.767330 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.767311 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tghn4"
Apr 17 17:10:06.842547 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.842482 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-555cf688b8-smdbn"
Apr 17 17:10:06.886807 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:10:06.886760 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd8e3ca4_bd8f_42be_8464_9caa8f36f300.slice/crio-26dfeb28da1424c795875f66a7b38fc65ca13c4e9d9af2a1810ddb50b261c6bc WatchSource:0}: Error finding container 26dfeb28da1424c795875f66a7b38fc65ca13c4e9d9af2a1810ddb50b261c6bc: Status 404 returned error can't find the container with id 26dfeb28da1424c795875f66a7b38fc65ca13c4e9d9af2a1810ddb50b261c6bc
Apr 17 17:10:06.887204 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.887180 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wsbz9"]
Apr 17 17:10:06.973653 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:06.973615 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["openshift-image-registry/image-registry-555cf688b8-smdbn"] Apr 17 17:10:06.976260 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:10:06.976237 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cad22a0_bc57_4643_ab1b_ce70d19c1c47.slice/crio-d2e24f429477b4eb9359c05b3952d89294255cffc0cf9678ab347a5767080d8d WatchSource:0}: Error finding container d2e24f429477b4eb9359c05b3952d89294255cffc0cf9678ab347a5767080d8d: Status 404 returned error can't find the container with id d2e24f429477b4eb9359c05b3952d89294255cffc0cf9678ab347a5767080d8d Apr 17 17:10:07.107264 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:07.107180 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-546867ff6c-k7mp9"] Apr 17 17:10:07.110247 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:10:07.110200 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod078e4a62_1e16_43bc_8318_2e2a520db01f.slice/crio-0100d628b5a2f623219b245b41b6f4953d196482e987a6ab1b06698a747897b4 WatchSource:0}: Error finding container 0100d628b5a2f623219b245b41b6f4953d196482e987a6ab1b06698a747897b4: Status 404 returned error can't find the container with id 0100d628b5a2f623219b245b41b6f4953d196482e987a6ab1b06698a747897b4 Apr 17 17:10:07.112670 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:07.112563 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tghn4"] Apr 17 17:10:07.117538 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:10:07.117513 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81b4269c_846c_46c3_9c36_0aa0083de609.slice/crio-1cbd31d5b82114606103bea4b028402edf7b9295b6257b2697d43b1e8e0cfa07 WatchSource:0}: Error finding container 1cbd31d5b82114606103bea4b028402edf7b9295b6257b2697d43b1e8e0cfa07: 
Status 404 returned error can't find the container with id 1cbd31d5b82114606103bea4b028402edf7b9295b6257b2697d43b1e8e0cfa07 Apr 17 17:10:07.410096 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:07.409999 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546867ff6c-k7mp9" event={"ID":"078e4a62-1e16-43bc-8318-2e2a520db01f","Type":"ContainerStarted","Data":"0100d628b5a2f623219b245b41b6f4953d196482e987a6ab1b06698a747897b4"} Apr 17 17:10:07.411169 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:07.411139 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wsbz9" event={"ID":"cd8e3ca4-bd8f-42be-8464-9caa8f36f300","Type":"ContainerStarted","Data":"26dfeb28da1424c795875f66a7b38fc65ca13c4e9d9af2a1810ddb50b261c6bc"} Apr 17 17:10:07.412638 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:07.412611 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-555cf688b8-smdbn" event={"ID":"9cad22a0-bc57-4643-ab1b-ce70d19c1c47","Type":"ContainerStarted","Data":"acbe13b38267d1773dfc5ebd96776b701230ca7b8067a4a3ac4c08f60a54c19b"} Apr 17 17:10:07.412775 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:07.412648 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-555cf688b8-smdbn" event={"ID":"9cad22a0-bc57-4643-ab1b-ce70d19c1c47","Type":"ContainerStarted","Data":"d2e24f429477b4eb9359c05b3952d89294255cffc0cf9678ab347a5767080d8d"} Apr 17 17:10:07.412775 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:07.412686 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-555cf688b8-smdbn" Apr 17 17:10:07.414001 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:07.413944 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tghn4" 
event={"ID":"81b4269c-846c-46c3-9c36-0aa0083de609","Type":"ContainerStarted","Data":"0b596cab2ae63c400f5afe57404ec9905f95a496154adea5236b66acc5ee8e21"} Apr 17 17:10:07.414001 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:07.413979 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tghn4" event={"ID":"81b4269c-846c-46c3-9c36-0aa0083de609","Type":"ContainerStarted","Data":"1cbd31d5b82114606103bea4b028402edf7b9295b6257b2697d43b1e8e0cfa07"} Apr 17 17:10:07.415068 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:07.415046 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qm5w4" event={"ID":"5b54101b-e6bf-474e-95bf-7a3894ef0486","Type":"ContainerStarted","Data":"91fc339c2f8d4bffe74a99cad61143605ef56a0c1cc13ebe8d475e31e856f022"} Apr 17 17:10:07.431782 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:07.431729 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-555cf688b8-smdbn" podStartSLOduration=1.431713144 podStartE2EDuration="1.431713144s" podCreationTimestamp="2026-04-17 17:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:10:07.430271539 +0000 UTC m=+135.097955194" watchObservedRunningTime="2026-04-17 17:10:07.431713144 +0000 UTC m=+135.099396799" Apr 17 17:10:08.419729 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:08.419695 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tghn4" event={"ID":"81b4269c-846c-46c3-9c36-0aa0083de609","Type":"ContainerStarted","Data":"562311d03e4189ed432032a0f222aaf8fff5011ea018319d876cf815c8b00ff9"} Apr 17 17:10:08.421073 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:08.421050 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-qm5w4" event={"ID":"5b54101b-e6bf-474e-95bf-7a3894ef0486","Type":"ContainerStarted","Data":"06e5d3193d3078828a0f2f49cb0733f0e8deac8e13cb563fb2862b6082f00668"} Apr 17 17:10:08.437915 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:08.437848 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qm5w4" podStartSLOduration=1.006080505 podStartE2EDuration="2.437831913s" podCreationTimestamp="2026-04-17 17:10:06 +0000 UTC" firstStartedPulling="2026-04-17 17:10:06.757033851 +0000 UTC m=+134.424717483" lastFinishedPulling="2026-04-17 17:10:08.188785242 +0000 UTC m=+135.856468891" observedRunningTime="2026-04-17 17:10:08.435691534 +0000 UTC m=+136.103375189" watchObservedRunningTime="2026-04-17 17:10:08.437831913 +0000 UTC m=+136.105515567" Apr 17 17:10:09.426151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:09.426115 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wsbz9" event={"ID":"cd8e3ca4-bd8f-42be-8464-9caa8f36f300","Type":"ContainerStarted","Data":"063d5a289e2d7353fa566e70c00344e99c07fbe4af704aee4ec6cbfb77328078"} Apr 17 17:10:09.426614 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:09.426492 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wsbz9" Apr 17 17:10:09.431588 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:09.431551 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wsbz9" Apr 17 17:10:09.442531 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:09.442481 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wsbz9" 
podStartSLOduration=1.9681354 podStartE2EDuration="3.442447523s" podCreationTimestamp="2026-04-17 17:10:06 +0000 UTC" firstStartedPulling="2026-04-17 17:10:06.889916464 +0000 UTC m=+134.557600096" lastFinishedPulling="2026-04-17 17:10:08.364228573 +0000 UTC m=+136.031912219" observedRunningTime="2026-04-17 17:10:09.441570517 +0000 UTC m=+137.109254163" watchObservedRunningTime="2026-04-17 17:10:09.442447523 +0000 UTC m=+137.110131176" Apr 17 17:10:09.994770 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:09.994725 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-28tp8"] Apr 17 17:10:10.019840 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.019813 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-28tp8"] Apr 17 17:10:10.020007 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.019927 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:10.022458 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.022427 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 17:10:10.022458 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.022453 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-pfxks\"" Apr 17 17:10:10.022628 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.022437 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 17:10:10.022628 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.022437 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 17:10:10.124228 ip-10-0-138-224 
kubenswrapper[2572]: I0417 17:10:10.124179 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-28tp8\" (UID: \"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:10.124349 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.124306 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-28tp8\" (UID: \"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:10.124403 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.124346 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-28tp8\" (UID: \"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:10.124467 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.124410 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9csh\" (UniqueName: \"kubernetes.io/projected/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-kube-api-access-z9csh\") pod \"prometheus-operator-5676c8c784-28tp8\" (UID: \"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:10.225490 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.225445 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-28tp8\" (UID: \"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:10.225657 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.225494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-28tp8\" (UID: \"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:10.225657 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.225554 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9csh\" (UniqueName: \"kubernetes.io/projected/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-kube-api-access-z9csh\") pod \"prometheus-operator-5676c8c784-28tp8\" (UID: \"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:10.225657 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.225590 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-28tp8\" (UID: \"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:10.225815 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:10:10.225740 2572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 17 17:10:10.225815 ip-10-0-138-224 kubenswrapper[2572]: E0417 
17:10:10.225809 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-prometheus-operator-tls podName:c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5 nodeName:}" failed. No retries permitted until 2026-04-17 17:10:10.725789447 +0000 UTC m=+138.393473097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-28tp8" (UID: "c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5") : secret "prometheus-operator-tls" not found Apr 17 17:10:10.226352 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.226324 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-28tp8\" (UID: \"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:10.228061 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.228037 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-28tp8\" (UID: \"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:10.234778 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.234753 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9csh\" (UniqueName: \"kubernetes.io/projected/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-kube-api-access-z9csh\") pod \"prometheus-operator-5676c8c784-28tp8\" (UID: \"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 
17 17:10:10.730443 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:10.730407 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-28tp8\" (UID: \"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:10.730809 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:10:10.730538 2572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 17 17:10:10.730809 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:10:10.730604 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-prometheus-operator-tls podName:c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5 nodeName:}" failed. No retries permitted until 2026-04-17 17:10:11.730588061 +0000 UTC m=+139.398271712 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-28tp8" (UID: "c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5") : secret "prometheus-operator-tls" not found Apr 17 17:10:11.433969 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:11.433934 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tghn4" event={"ID":"81b4269c-846c-46c3-9c36-0aa0083de609","Type":"ContainerStarted","Data":"dc7f6b3c9c1727bdf62fb99f793f802b0a4316f16c4cccbf6f2e94f1f851c275"} Apr 17 17:10:11.435513 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:11.435489 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546867ff6c-k7mp9" event={"ID":"078e4a62-1e16-43bc-8318-2e2a520db01f","Type":"ContainerStarted","Data":"39e0e894be6f4ccaa629950fe914407b4d27e259ed7468da373833910b012d97"} Apr 17 17:10:11.454683 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:11.454629 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-tghn4" podStartSLOduration=1.727716558 podStartE2EDuration="5.454617686s" podCreationTimestamp="2026-04-17 17:10:06 +0000 UTC" firstStartedPulling="2026-04-17 17:10:07.171838122 +0000 UTC m=+134.839521761" lastFinishedPulling="2026-04-17 17:10:10.898739257 +0000 UTC m=+138.566422889" observedRunningTime="2026-04-17 17:10:11.454383908 +0000 UTC m=+139.122067564" watchObservedRunningTime="2026-04-17 17:10:11.454617686 +0000 UTC m=+139.122301383" Apr 17 17:10:11.470732 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:11.470690 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-546867ff6c-k7mp9" podStartSLOduration=1.680343563 podStartE2EDuration="5.470678845s" podCreationTimestamp="2026-04-17 17:10:06 +0000 UTC" firstStartedPulling="2026-04-17 
17:10:07.112521208 +0000 UTC m=+134.780204840" lastFinishedPulling="2026-04-17 17:10:10.902856487 +0000 UTC m=+138.570540122" observedRunningTime="2026-04-17 17:10:11.469813143 +0000 UTC m=+139.137496803" watchObservedRunningTime="2026-04-17 17:10:11.470678845 +0000 UTC m=+139.138362498" Apr 17 17:10:11.738675 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:11.738603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-28tp8\" (UID: \"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:11.741260 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:11.741237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-28tp8\" (UID: \"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:11.839625 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:11.839599 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" Apr 17 17:10:11.956224 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:11.956002 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-28tp8"] Apr 17 17:10:11.958534 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:10:11.958507 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc548aaa0_7fdb_46f1_8ebb_9f5a8edd1ee5.slice/crio-822e244806a9d75f80cff100bfd257b088cfeecf1dc344b876d4a7eddf83d3d3 WatchSource:0}: Error finding container 822e244806a9d75f80cff100bfd257b088cfeecf1dc344b876d4a7eddf83d3d3: Status 404 returned error can't find the container with id 822e244806a9d75f80cff100bfd257b088cfeecf1dc344b876d4a7eddf83d3d3 Apr 17 17:10:12.438906 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:12.438874 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" event={"ID":"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5","Type":"ContainerStarted","Data":"822e244806a9d75f80cff100bfd257b088cfeecf1dc344b876d4a7eddf83d3d3"} Apr 17 17:10:13.445245 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:13.445179 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" event={"ID":"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5","Type":"ContainerStarted","Data":"c345895ae89416eadad355b927c08e22bb5754a5ab37d5c0082bc4f3bbfb73ba"} Apr 17 17:10:13.445245 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:13.445249 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" event={"ID":"c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5","Type":"ContainerStarted","Data":"3c9ecbdb11866763ff74e39b63c4523a79f839e1fd5a5a1285f07ce1d8a5763f"} Apr 17 17:10:15.365677 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.365627 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-28tp8" podStartSLOduration=5.08542852 podStartE2EDuration="6.365612019s" podCreationTimestamp="2026-04-17 17:10:09 +0000 UTC" firstStartedPulling="2026-04-17 17:10:11.960368847 +0000 UTC m=+139.628052483" lastFinishedPulling="2026-04-17 17:10:13.240552346 +0000 UTC m=+140.908235982" observedRunningTime="2026-04-17 17:10:13.462292994 +0000 UTC m=+141.129976652" watchObservedRunningTime="2026-04-17 17:10:15.365612019 +0000 UTC m=+143.033295674" Apr 17 17:10:15.366696 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.366677 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jxtms"] Apr 17 17:10:15.369092 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.369071 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.371567 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.371543 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 17:10:15.371692 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.371566 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 17:10:15.371692 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.371663 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 17:10:15.372001 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.371979 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wcb6d\"" Apr 17 17:10:15.467588 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.467553 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e4762e69-9a43-41b3-9bb8-2a302e94867a-root\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.467729 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.467597 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-tls\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.467729 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.467654 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4762e69-9a43-41b3-9bb8-2a302e94867a-metrics-client-ca\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.467729 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.467699 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-wtmp\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.467862 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.467743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4762e69-9a43-41b3-9bb8-2a302e94867a-sys\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.467862 
ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.467766 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-textfile\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.467862 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.467802 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.467988 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.467872 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-accelerators-collector-config\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.467988 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.467903 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c49dn\" (UniqueName: \"kubernetes.io/projected/e4762e69-9a43-41b3-9bb8-2a302e94867a-kube-api-access-c49dn\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.568516 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.568481 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-wtmp\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.568672 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.568538 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4762e69-9a43-41b3-9bb8-2a302e94867a-sys\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.568672 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.568555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-textfile\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.568672 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.568579 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.568672 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.568613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-accelerators-collector-config\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.568672 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.568633 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c49dn\" (UniqueName: \"kubernetes.io/projected/e4762e69-9a43-41b3-9bb8-2a302e94867a-kube-api-access-c49dn\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.568672 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.568659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-wtmp\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.568974 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.568695 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4762e69-9a43-41b3-9bb8-2a302e94867a-sys\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.568974 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.568666 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e4762e69-9a43-41b3-9bb8-2a302e94867a-root\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.568974 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.568777 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-tls\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.568974 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.568801 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e4762e69-9a43-41b3-9bb8-2a302e94867a-root\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.568974 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.568855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4762e69-9a43-41b3-9bb8-2a302e94867a-metrics-client-ca\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.568974 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:10:15.568889 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:10:15.568974 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.568914 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-textfile\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.568974 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:10:15.568954 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-tls podName:e4762e69-9a43-41b3-9bb8-2a302e94867a nodeName:}" failed. No retries permitted until 2026-04-17 17:10:16.068931671 +0000 UTC m=+143.736615303 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-tls") pod "node-exporter-jxtms" (UID: "e4762e69-9a43-41b3-9bb8-2a302e94867a") : secret "node-exporter-tls" not found Apr 17 17:10:15.569291 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.569228 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-accelerators-collector-config\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.569400 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.569379 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4762e69-9a43-41b3-9bb8-2a302e94867a-metrics-client-ca\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.570880 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.570862 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:15.577642 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:15.577619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c49dn\" (UniqueName: \"kubernetes.io/projected/e4762e69-9a43-41b3-9bb8-2a302e94867a-kube-api-access-c49dn\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:16.072368 ip-10-0-138-224 
kubenswrapper[2572]: I0417 17:10:16.072329 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-tls\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:16.074665 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:16.074645 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e4762e69-9a43-41b3-9bb8-2a302e94867a-node-exporter-tls\") pod \"node-exporter-jxtms\" (UID: \"e4762e69-9a43-41b3-9bb8-2a302e94867a\") " pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:16.278098 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:16.278068 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jxtms" Apr 17 17:10:16.287438 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:10:16.287409 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4762e69_9a43_41b3_9bb8_2a302e94867a.slice/crio-c19799510f0a2950ae903e5d5a4f67d2a044f7441c705d672c93a331b962bc2e WatchSource:0}: Error finding container c19799510f0a2950ae903e5d5a4f67d2a044f7441c705d672c93a331b962bc2e: Status 404 returned error can't find the container with id c19799510f0a2950ae903e5d5a4f67d2a044f7441c705d672c93a331b962bc2e Apr 17 17:10:16.454681 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:16.454650 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jxtms" event={"ID":"e4762e69-9a43-41b3-9bb8-2a302e94867a","Type":"ContainerStarted","Data":"c19799510f0a2950ae903e5d5a4f67d2a044f7441c705d672c93a331b962bc2e"} Apr 17 17:10:16.752278 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:16.752192 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-console/console-546867ff6c-k7mp9" Apr 17 17:10:16.752278 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:16.752244 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-546867ff6c-k7mp9" Apr 17 17:10:16.753317 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:16.753290 2572 patch_prober.go:28] interesting pod/console-546867ff6c-k7mp9 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.16:8443/health\": dial tcp 10.134.0.16:8443: connect: connection refused" start-of-body= Apr 17 17:10:16.753461 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:16.753337 2572 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-546867ff6c-k7mp9" podUID="078e4a62-1e16-43bc-8318-2e2a520db01f" containerName="console" probeResult="failure" output="Get \"https://10.134.0.16:8443/health\": dial tcp 10.134.0.16:8443: connect: connection refused" Apr 17 17:10:17.462509 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:17.462468 2572 generic.go:358] "Generic (PLEG): container finished" podID="e4762e69-9a43-41b3-9bb8-2a302e94867a" containerID="01e895bd41740416a7bad0730adeaaabbfbc06d536e4542aa95396abfa08f0fe" exitCode=0 Apr 17 17:10:17.462853 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:17.462556 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jxtms" event={"ID":"e4762e69-9a43-41b3-9bb8-2a302e94867a","Type":"ContainerDied","Data":"01e895bd41740416a7bad0730adeaaabbfbc06d536e4542aa95396abfa08f0fe"} Apr 17 17:10:18.466892 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:18.466861 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jxtms" event={"ID":"e4762e69-9a43-41b3-9bb8-2a302e94867a","Type":"ContainerStarted","Data":"37ef512504790d3741dcc1e06363c16c679ed3c1c2dbc91d69470ad6c0d18554"} Apr 17 17:10:18.466892 ip-10-0-138-224 
kubenswrapper[2572]: I0417 17:10:18.466897 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jxtms" event={"ID":"e4762e69-9a43-41b3-9bb8-2a302e94867a","Type":"ContainerStarted","Data":"d98de80e1b988284bce94fa8ee25feb8168a5500ef91e45aba2f52da6237b86b"} Apr 17 17:10:18.489224 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:18.489167 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jxtms" podStartSLOduration=2.656509088 podStartE2EDuration="3.489153489s" podCreationTimestamp="2026-04-17 17:10:15 +0000 UTC" firstStartedPulling="2026-04-17 17:10:16.289261397 +0000 UTC m=+143.956945028" lastFinishedPulling="2026-04-17 17:10:17.121905797 +0000 UTC m=+144.789589429" observedRunningTime="2026-04-17 17:10:18.488685652 +0000 UTC m=+146.156369307" watchObservedRunningTime="2026-04-17 17:10:18.489153489 +0000 UTC m=+146.156837142" Apr 17 17:10:20.542127 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.542095 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-546867ff6c-k7mp9"] Apr 17 17:10:20.574119 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.574088 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-559bf5db75-cbrwz"] Apr 17 17:10:20.576370 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.576356 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.584310 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.584289 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 17:10:20.587703 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.587681 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-559bf5db75-cbrwz"] Apr 17 17:10:20.604688 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.604669 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-oauth-serving-cert\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.604792 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.604699 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-service-ca\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.604792 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.604725 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d624f9ab-15cd-408f-9a39-dbcd09170944-console-oauth-config\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.604901 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.604818 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-console-config\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.604968 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.604902 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-trusted-ca-bundle\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.604968 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.604946 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glhh2\" (UniqueName: \"kubernetes.io/projected/d624f9ab-15cd-408f-9a39-dbcd09170944-kube-api-access-glhh2\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.605085 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.604975 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d624f9ab-15cd-408f-9a39-dbcd09170944-console-serving-cert\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.705295 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.705276 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-console-config\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.705403 ip-10-0-138-224 kubenswrapper[2572]: I0417 
17:10:20.705327 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-trusted-ca-bundle\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.705403 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.705350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glhh2\" (UniqueName: \"kubernetes.io/projected/d624f9ab-15cd-408f-9a39-dbcd09170944-kube-api-access-glhh2\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.705624 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.705465 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d624f9ab-15cd-408f-9a39-dbcd09170944-console-serving-cert\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.705624 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.705506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-oauth-serving-cert\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.705624 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.705529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-service-ca\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " 
pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.705624 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.705553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d624f9ab-15cd-408f-9a39-dbcd09170944-console-oauth-config\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.706025 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.705990 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-console-config\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.706132 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.706110 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-oauth-serving-cert\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.706227 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.706188 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-service-ca\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.706278 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.706255 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-trusted-ca-bundle\") pod \"console-559bf5db75-cbrwz\" (UID: 
\"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.707931 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.707907 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d624f9ab-15cd-408f-9a39-dbcd09170944-console-serving-cert\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.708009 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.707941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d624f9ab-15cd-408f-9a39-dbcd09170944-console-oauth-config\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.712844 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.712825 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glhh2\" (UniqueName: \"kubernetes.io/projected/d624f9ab-15cd-408f-9a39-dbcd09170944-kube-api-access-glhh2\") pod \"console-559bf5db75-cbrwz\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:20.886036 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:20.885962 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:21.004048 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:21.003989 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-559bf5db75-cbrwz"] Apr 17 17:10:21.006704 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:10:21.006675 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd624f9ab_15cd_408f_9a39_dbcd09170944.slice/crio-c0c89afb97e4748e794ef6f525aa5f7ae744d5adcfc5371f54cad6baa4174735 WatchSource:0}: Error finding container c0c89afb97e4748e794ef6f525aa5f7ae744d5adcfc5371f54cad6baa4174735: Status 404 returned error can't find the container with id c0c89afb97e4748e794ef6f525aa5f7ae744d5adcfc5371f54cad6baa4174735 Apr 17 17:10:21.476626 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:21.476582 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-559bf5db75-cbrwz" event={"ID":"d624f9ab-15cd-408f-9a39-dbcd09170944","Type":"ContainerStarted","Data":"946947714e4b50d81c227fb2c877722619823b58e62e291c53e18133acf2ad26"} Apr 17 17:10:21.476626 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:21.476630 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-559bf5db75-cbrwz" event={"ID":"d624f9ab-15cd-408f-9a39-dbcd09170944","Type":"ContainerStarted","Data":"c0c89afb97e4748e794ef6f525aa5f7ae744d5adcfc5371f54cad6baa4174735"} Apr 17 17:10:21.494396 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:21.494352 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-559bf5db75-cbrwz" podStartSLOduration=1.494338574 podStartE2EDuration="1.494338574s" podCreationTimestamp="2026-04-17 17:10:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:10:21.492624963 +0000 UTC 
m=+149.160308654" watchObservedRunningTime="2026-04-17 17:10:21.494338574 +0000 UTC m=+149.162022228" Apr 17 17:10:26.846264 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:26.846200 2572 patch_prober.go:28] interesting pod/image-registry-555cf688b8-smdbn container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 17:10:26.846688 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:26.846284 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-555cf688b8-smdbn" podUID="9cad22a0-bc57-4643-ab1b-ce70d19c1c47" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:10:28.246905 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:10:28.246871 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-77gh9" podUID="05633129-a6c5-4b2a-9ddc-4a376e6b79c3" Apr 17 17:10:28.253066 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:10:28.253039 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-tjgfv" podUID="8cd72e65-31b5-4a4a-acf8-3800bb1d5898" Apr 17 17:10:28.425514 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:28.425494 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-555cf688b8-smdbn" Apr 17 17:10:28.495526 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:28.495497 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tjgfv" Apr 17 17:10:28.495526 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:28.495517 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-77gh9" Apr 17 17:10:30.886451 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:30.886411 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:30.886451 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:30.886459 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:30.891037 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:30.891015 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:31.512469 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:31.512440 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:10:33.198292 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:33.198261 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9" Apr 17 17:10:33.198292 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:33.198295 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert\") pod \"ingress-canary-tjgfv\" (UID: \"8cd72e65-31b5-4a4a-acf8-3800bb1d5898\") " pod="openshift-ingress-canary/ingress-canary-tjgfv" Apr 17 17:10:33.200595 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:33.200569 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05633129-a6c5-4b2a-9ddc-4a376e6b79c3-metrics-tls\") pod \"dns-default-77gh9\" (UID: \"05633129-a6c5-4b2a-9ddc-4a376e6b79c3\") " pod="openshift-dns/dns-default-77gh9" Apr 17 17:10:33.200732 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:33.200603 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd72e65-31b5-4a4a-acf8-3800bb1d5898-cert\") pod \"ingress-canary-tjgfv\" (UID: \"8cd72e65-31b5-4a4a-acf8-3800bb1d5898\") " pod="openshift-ingress-canary/ingress-canary-tjgfv" Apr 17 17:10:33.298887 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:33.298862 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qdqk7\"" Apr 17 17:10:33.298887 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:33.298881 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j45n5\"" Apr 17 17:10:33.306139 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:33.306123 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-77gh9" Apr 17 17:10:33.306225 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:33.306178 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tjgfv" Apr 17 17:10:33.433464 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:33.433429 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tjgfv"] Apr 17 17:10:33.435364 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:10:33.435338 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cd72e65_31b5_4a4a_acf8_3800bb1d5898.slice/crio-752ec2dc8af44cf7be3e12a5dc968a6d17b59c8fda67cc85bdde98af315c12c4 WatchSource:0}: Error finding container 752ec2dc8af44cf7be3e12a5dc968a6d17b59c8fda67cc85bdde98af315c12c4: Status 404 returned error can't find the container with id 752ec2dc8af44cf7be3e12a5dc968a6d17b59c8fda67cc85bdde98af315c12c4 Apr 17 17:10:33.452334 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:33.452278 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-77gh9"] Apr 17 17:10:33.455510 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:10:33.455487 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05633129_a6c5_4b2a_9ddc_4a376e6b79c3.slice/crio-a3d93634553e84189ea84c8ef4b2cc77e830d0afdc2e84680030a2d29112bae6 WatchSource:0}: Error finding container a3d93634553e84189ea84c8ef4b2cc77e830d0afdc2e84680030a2d29112bae6: Status 404 returned error can't find the container with id a3d93634553e84189ea84c8ef4b2cc77e830d0afdc2e84680030a2d29112bae6 Apr 17 17:10:33.511917 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:33.511887 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tjgfv" event={"ID":"8cd72e65-31b5-4a4a-acf8-3800bb1d5898","Type":"ContainerStarted","Data":"752ec2dc8af44cf7be3e12a5dc968a6d17b59c8fda67cc85bdde98af315c12c4"} Apr 17 17:10:33.512762 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:33.512740 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-77gh9" event={"ID":"05633129-a6c5-4b2a-9ddc-4a376e6b79c3","Type":"ContainerStarted","Data":"a3d93634553e84189ea84c8ef4b2cc77e830d0afdc2e84680030a2d29112bae6"} Apr 17 17:10:35.523920 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:35.523877 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tjgfv" event={"ID":"8cd72e65-31b5-4a4a-acf8-3800bb1d5898","Type":"ContainerStarted","Data":"7c81e877e10b9e29c43b161c4ab7ede4ae36dc0e97155ba4f01a40545b21f573"} Apr 17 17:10:35.526541 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:35.526517 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-77gh9" event={"ID":"05633129-a6c5-4b2a-9ddc-4a376e6b79c3","Type":"ContainerStarted","Data":"70d7f4605783b266811ef3f2c5f6ebbe2240b70402edb5970d1fe5980ee25b9b"} Apr 17 17:10:35.526541 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:35.526544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-77gh9" event={"ID":"05633129-a6c5-4b2a-9ddc-4a376e6b79c3","Type":"ContainerStarted","Data":"7ea69f9304afcba7e5eb472c8abbdb2bb4c9e9ad5a754a60b480c15c3a8b0779"} Apr 17 17:10:35.526721 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:35.526671 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-77gh9" Apr 17 17:10:35.542417 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:35.542380 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tjgfv" podStartSLOduration=128.747434394 podStartE2EDuration="2m10.542368726s" podCreationTimestamp="2026-04-17 17:08:25 +0000 UTC" firstStartedPulling="2026-04-17 17:10:33.437472184 +0000 UTC m=+161.105155816" lastFinishedPulling="2026-04-17 17:10:35.232406506 +0000 UTC m=+162.900090148" observedRunningTime="2026-04-17 17:10:35.541253075 +0000 UTC 
m=+163.208936729" watchObservedRunningTime="2026-04-17 17:10:35.542368726 +0000 UTC m=+163.210052371" Apr 17 17:10:35.557169 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:35.557131 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-77gh9" podStartSLOduration=128.784509196 podStartE2EDuration="2m10.557119861s" podCreationTimestamp="2026-04-17 17:08:25 +0000 UTC" firstStartedPulling="2026-04-17 17:10:33.457182312 +0000 UTC m=+161.124865947" lastFinishedPulling="2026-04-17 17:10:35.22979298 +0000 UTC m=+162.897476612" observedRunningTime="2026-04-17 17:10:35.556938516 +0000 UTC m=+163.224622191" watchObservedRunningTime="2026-04-17 17:10:35.557119861 +0000 UTC m=+163.224803498" Apr 17 17:10:40.396073 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:40.395997 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-77gh9_05633129-a6c5-4b2a-9ddc-4a376e6b79c3/dns/0.log" Apr 17 17:10:40.596427 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:40.596400 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-77gh9_05633129-a6c5-4b2a-9ddc-4a376e6b79c3/kube-rbac-proxy/0.log" Apr 17 17:10:41.595840 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:41.595811 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rxccf_c3fbb6b8-715e-4512-b7ce-584ff3fdf72e/dns-node-resolver/0.log" Apr 17 17:10:41.796227 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:41.796186 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7b86f88f8-qvj8p_394bfa3f-360e-4dbb-ab25-846a90b23983/router/0.log" Apr 17 17:10:42.195445 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:42.195418 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tjgfv_8cd72e65-31b5-4a4a-acf8-3800bb1d5898/serve-healthcheck-canary/0.log" Apr 17 17:10:45.532151 ip-10-0-138-224 
kubenswrapper[2572]: I0417 17:10:45.532116 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-77gh9" Apr 17 17:10:45.560800 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.560743 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-546867ff6c-k7mp9" podUID="078e4a62-1e16-43bc-8318-2e2a520db01f" containerName="console" containerID="cri-o://39e0e894be6f4ccaa629950fe914407b4d27e259ed7468da373833910b012d97" gracePeriod=15 Apr 17 17:10:45.796473 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.795398 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-546867ff6c-k7mp9_078e4a62-1e16-43bc-8318-2e2a520db01f/console/0.log" Apr 17 17:10:45.796473 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.795484 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-546867ff6c-k7mp9" Apr 17 17:10:45.888589 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.888558 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-oauth-serving-cert\") pod \"078e4a62-1e16-43bc-8318-2e2a520db01f\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " Apr 17 17:10:45.888703 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.888592 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/078e4a62-1e16-43bc-8318-2e2a520db01f-console-serving-cert\") pod \"078e4a62-1e16-43bc-8318-2e2a520db01f\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " Apr 17 17:10:45.888746 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.888702 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-console-config\") pod \"078e4a62-1e16-43bc-8318-2e2a520db01f\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " Apr 17 17:10:45.888746 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.888738 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z26dr\" (UniqueName: \"kubernetes.io/projected/078e4a62-1e16-43bc-8318-2e2a520db01f-kube-api-access-z26dr\") pod \"078e4a62-1e16-43bc-8318-2e2a520db01f\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " Apr 17 17:10:45.888807 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.888759 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/078e4a62-1e16-43bc-8318-2e2a520db01f-console-oauth-config\") pod \"078e4a62-1e16-43bc-8318-2e2a520db01f\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " Apr 17 17:10:45.888854 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.888812 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-service-ca\") pod \"078e4a62-1e16-43bc-8318-2e2a520db01f\" (UID: \"078e4a62-1e16-43bc-8318-2e2a520db01f\") " Apr 17 17:10:45.889021 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.888993 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "078e4a62-1e16-43bc-8318-2e2a520db01f" (UID: "078e4a62-1e16-43bc-8318-2e2a520db01f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:10:45.889132 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.889114 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-console-config" (OuterVolumeSpecName: "console-config") pod "078e4a62-1e16-43bc-8318-2e2a520db01f" (UID: "078e4a62-1e16-43bc-8318-2e2a520db01f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:10:45.889333 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.889308 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-service-ca" (OuterVolumeSpecName: "service-ca") pod "078e4a62-1e16-43bc-8318-2e2a520db01f" (UID: "078e4a62-1e16-43bc-8318-2e2a520db01f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:10:45.890938 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.890913 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/078e4a62-1e16-43bc-8318-2e2a520db01f-kube-api-access-z26dr" (OuterVolumeSpecName: "kube-api-access-z26dr") pod "078e4a62-1e16-43bc-8318-2e2a520db01f" (UID: "078e4a62-1e16-43bc-8318-2e2a520db01f"). InnerVolumeSpecName "kube-api-access-z26dr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:10:45.891041 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.890919 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/078e4a62-1e16-43bc-8318-2e2a520db01f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "078e4a62-1e16-43bc-8318-2e2a520db01f" (UID: "078e4a62-1e16-43bc-8318-2e2a520db01f"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:10:45.891041 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.890983 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/078e4a62-1e16-43bc-8318-2e2a520db01f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "078e4a62-1e16-43bc-8318-2e2a520db01f" (UID: "078e4a62-1e16-43bc-8318-2e2a520db01f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:10:45.989835 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.989806 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z26dr\" (UniqueName: \"kubernetes.io/projected/078e4a62-1e16-43bc-8318-2e2a520db01f-kube-api-access-z26dr\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:10:45.989835 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.989832 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/078e4a62-1e16-43bc-8318-2e2a520db01f-console-oauth-config\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:10:45.989977 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.989847 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-service-ca\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:10:45.989977 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.989860 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-oauth-serving-cert\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:10:45.989977 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.989874 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/078e4a62-1e16-43bc-8318-2e2a520db01f-console-serving-cert\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:10:45.989977 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:45.989887 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/078e4a62-1e16-43bc-8318-2e2a520db01f-console-config\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:10:46.559693 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:46.559665 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-546867ff6c-k7mp9_078e4a62-1e16-43bc-8318-2e2a520db01f/console/0.log" Apr 17 17:10:46.560081 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:46.559713 2572 generic.go:358] "Generic (PLEG): container finished" podID="078e4a62-1e16-43bc-8318-2e2a520db01f" containerID="39e0e894be6f4ccaa629950fe914407b4d27e259ed7468da373833910b012d97" exitCode=2 Apr 17 17:10:46.560081 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:46.559759 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546867ff6c-k7mp9" event={"ID":"078e4a62-1e16-43bc-8318-2e2a520db01f","Type":"ContainerDied","Data":"39e0e894be6f4ccaa629950fe914407b4d27e259ed7468da373833910b012d97"} Apr 17 17:10:46.560081 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:46.559780 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-546867ff6c-k7mp9" Apr 17 17:10:46.560081 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:46.559790 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546867ff6c-k7mp9" event={"ID":"078e4a62-1e16-43bc-8318-2e2a520db01f","Type":"ContainerDied","Data":"0100d628b5a2f623219b245b41b6f4953d196482e987a6ab1b06698a747897b4"} Apr 17 17:10:46.560081 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:46.559810 2572 scope.go:117] "RemoveContainer" containerID="39e0e894be6f4ccaa629950fe914407b4d27e259ed7468da373833910b012d97" Apr 17 17:10:46.569161 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:46.569134 2572 scope.go:117] "RemoveContainer" containerID="39e0e894be6f4ccaa629950fe914407b4d27e259ed7468da373833910b012d97" Apr 17 17:10:46.569484 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:10:46.569448 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39e0e894be6f4ccaa629950fe914407b4d27e259ed7468da373833910b012d97\": container with ID starting with 39e0e894be6f4ccaa629950fe914407b4d27e259ed7468da373833910b012d97 not found: ID does not exist" containerID="39e0e894be6f4ccaa629950fe914407b4d27e259ed7468da373833910b012d97" Apr 17 17:10:46.569566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:46.569487 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e0e894be6f4ccaa629950fe914407b4d27e259ed7468da373833910b012d97"} err="failed to get container status \"39e0e894be6f4ccaa629950fe914407b4d27e259ed7468da373833910b012d97\": rpc error: code = NotFound desc = could not find container \"39e0e894be6f4ccaa629950fe914407b4d27e259ed7468da373833910b012d97\": container with ID starting with 39e0e894be6f4ccaa629950fe914407b4d27e259ed7468da373833910b012d97 not found: ID does not exist" Apr 17 17:10:46.582279 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:46.582257 2572 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-546867ff6c-k7mp9"] Apr 17 17:10:46.588660 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:46.588632 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-546867ff6c-k7mp9"] Apr 17 17:10:46.950746 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:46.950705 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="078e4a62-1e16-43bc-8318-2e2a520db01f" path="/var/lib/kubelet/pods/078e4a62-1e16-43bc-8318-2e2a520db01f/volumes" Apr 17 17:10:56.589804 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:56.589769 2572 generic.go:358] "Generic (PLEG): container finished" podID="03d537b0-8e43-49ac-aaf8-dc6d4576a650" containerID="4a122f8f43c8dab2b06dbedd2d55f3872d643b72887d2d64298f12d893e953c4" exitCode=0 Apr 17 17:10:56.590178 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:56.589840 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm" event={"ID":"03d537b0-8e43-49ac-aaf8-dc6d4576a650","Type":"ContainerDied","Data":"4a122f8f43c8dab2b06dbedd2d55f3872d643b72887d2d64298f12d893e953c4"} Apr 17 17:10:56.590178 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:56.590110 2572 scope.go:117] "RemoveContainer" containerID="4a122f8f43c8dab2b06dbedd2d55f3872d643b72887d2d64298f12d893e953c4" Apr 17 17:10:57.594722 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:10:57.594689 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4wfsm" event={"ID":"03d537b0-8e43-49ac-aaf8-dc6d4576a650","Type":"ContainerStarted","Data":"24a8180c576e68790ae3dbcf00474996cd231e0f35e07b47e4c1fa3f6d920940"} Apr 17 17:11:03.614399 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:03.614318 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="7564c7de-af3f-4337-99b0-e2adc3471cf9" containerID="df02dfe81b9b0bcbd17eda46484da07381cb43e19076b47c6a601cc7006e92e5" exitCode=0 Apr 17 17:11:03.614807 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:03.614399 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554" event={"ID":"7564c7de-af3f-4337-99b0-e2adc3471cf9","Type":"ContainerDied","Data":"df02dfe81b9b0bcbd17eda46484da07381cb43e19076b47c6a601cc7006e92e5"} Apr 17 17:11:03.614807 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:03.614727 2572 scope.go:117] "RemoveContainer" containerID="df02dfe81b9b0bcbd17eda46484da07381cb43e19076b47c6a601cc7006e92e5" Apr 17 17:11:04.620801 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:04.620766 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kz554" event={"ID":"7564c7de-af3f-4337-99b0-e2adc3471cf9","Type":"ContainerStarted","Data":"16666bedcc433722b69d2e135671d9c570a282a3ac45c7d2fd1fa2be02240fcc"} Apr 17 17:11:43.931838 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:43.931775 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5786dc8478-6nbr6"] Apr 17 17:11:43.932457 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:43.932255 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="078e4a62-1e16-43bc-8318-2e2a520db01f" containerName="console" Apr 17 17:11:43.932457 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:43.932275 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="078e4a62-1e16-43bc-8318-2e2a520db01f" containerName="console" Apr 17 17:11:43.932457 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:43.932350 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="078e4a62-1e16-43bc-8318-2e2a520db01f" containerName="console" Apr 17 17:11:43.934314 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:43.934293 2572 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:43.944859 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:43.944833 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5786dc8478-6nbr6"] Apr 17 17:11:43.988569 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:43.988543 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-config\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:43.988699 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:43.988570 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-trusted-ca-bundle\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:43.988699 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:43.988591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-serving-cert\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:43.988699 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:43.988642 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-service-ca\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 
17 17:11:43.988699 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:43.988677 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-oauth-serving-cert\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:43.988879 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:43.988719 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-oauth-config\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:43.988879 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:43.988760 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k56sw\" (UniqueName: \"kubernetes.io/projected/33f66386-d1d3-4169-b4cc-f50d2ada18c7-kube-api-access-k56sw\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.089668 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.089637 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-config\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.089668 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.089668 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-trusted-ca-bundle\") 
pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.089883 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.089685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-serving-cert\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.089883 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.089707 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-service-ca\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.089883 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.089723 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-oauth-serving-cert\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.089883 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.089745 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-oauth-config\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.089883 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.089775 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k56sw\" (UniqueName: 
\"kubernetes.io/projected/33f66386-d1d3-4169-b4cc-f50d2ada18c7-kube-api-access-k56sw\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.090485 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.090452 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-config\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.090608 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.090500 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-service-ca\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.090672 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.090607 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-oauth-serving-cert\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.090846 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.090819 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-trusted-ca-bundle\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.092316 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.092289 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-oauth-config\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.092407 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.092383 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-serving-cert\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.097780 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.097745 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k56sw\" (UniqueName: \"kubernetes.io/projected/33f66386-d1d3-4169-b4cc-f50d2ada18c7-kube-api-access-k56sw\") pod \"console-5786dc8478-6nbr6\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.247400 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.247331 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:44.363200 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.363173 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5786dc8478-6nbr6"] Apr 17 17:11:44.365717 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:11:44.365690 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f66386_d1d3_4169_b4cc_f50d2ada18c7.slice/crio-4d2289d936064387c5bd9be590df681d6aa83e7b794dd15ee612d2bfe3927129 WatchSource:0}: Error finding container 4d2289d936064387c5bd9be590df681d6aa83e7b794dd15ee612d2bfe3927129: Status 404 returned error can't find the container with id 4d2289d936064387c5bd9be590df681d6aa83e7b794dd15ee612d2bfe3927129 Apr 17 17:11:44.738842 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.738803 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5786dc8478-6nbr6" event={"ID":"33f66386-d1d3-4169-b4cc-f50d2ada18c7","Type":"ContainerStarted","Data":"ad27326ed82faec7601f29d5ba678f7ab32bc3e64f281229971a656d0a400160"} Apr 17 17:11:44.738992 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.738846 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5786dc8478-6nbr6" event={"ID":"33f66386-d1d3-4169-b4cc-f50d2ada18c7","Type":"ContainerStarted","Data":"4d2289d936064387c5bd9be590df681d6aa83e7b794dd15ee612d2bfe3927129"} Apr 17 17:11:44.758131 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:44.758083 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5786dc8478-6nbr6" podStartSLOduration=1.7580699119999998 podStartE2EDuration="1.758069912s" podCreationTimestamp="2026-04-17 17:11:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:11:44.75708723 +0000 UTC 
m=+232.424770885" watchObservedRunningTime="2026-04-17 17:11:44.758069912 +0000 UTC m=+232.425753566" Apr 17 17:11:54.248363 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:54.248327 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:54.248831 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:54.248379 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:54.253080 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:54.253056 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:54.772164 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:54.772132 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:11:54.860525 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:11:54.860494 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-559bf5db75-cbrwz"] Apr 17 17:12:19.878757 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:19.878647 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-559bf5db75-cbrwz" podUID="d624f9ab-15cd-408f-9a39-dbcd09170944" containerName="console" containerID="cri-o://946947714e4b50d81c227fb2c877722619823b58e62e291c53e18133acf2ad26" gracePeriod=15 Apr 17 17:12:20.108527 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.108499 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-559bf5db75-cbrwz_d624f9ab-15cd-408f-9a39-dbcd09170944/console/0.log" Apr 17 17:12:20.108644 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.108577 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:12:20.145712 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.145651 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d624f9ab-15cd-408f-9a39-dbcd09170944-console-oauth-config\") pod \"d624f9ab-15cd-408f-9a39-dbcd09170944\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " Apr 17 17:12:20.145712 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.145684 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-console-config\") pod \"d624f9ab-15cd-408f-9a39-dbcd09170944\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " Apr 17 17:12:20.145846 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.145721 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-service-ca\") pod \"d624f9ab-15cd-408f-9a39-dbcd09170944\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " Apr 17 17:12:20.145846 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.145756 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d624f9ab-15cd-408f-9a39-dbcd09170944-console-serving-cert\") pod \"d624f9ab-15cd-408f-9a39-dbcd09170944\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " Apr 17 17:12:20.145846 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.145791 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-trusted-ca-bundle\") pod \"d624f9ab-15cd-408f-9a39-dbcd09170944\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " Apr 17 17:12:20.145846 ip-10-0-138-224 
kubenswrapper[2572]: I0417 17:12:20.145831 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-oauth-serving-cert\") pod \"d624f9ab-15cd-408f-9a39-dbcd09170944\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " Apr 17 17:12:20.146032 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.145868 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glhh2\" (UniqueName: \"kubernetes.io/projected/d624f9ab-15cd-408f-9a39-dbcd09170944-kube-api-access-glhh2\") pod \"d624f9ab-15cd-408f-9a39-dbcd09170944\" (UID: \"d624f9ab-15cd-408f-9a39-dbcd09170944\") " Apr 17 17:12:20.146270 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.146223 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-service-ca" (OuterVolumeSpecName: "service-ca") pod "d624f9ab-15cd-408f-9a39-dbcd09170944" (UID: "d624f9ab-15cd-408f-9a39-dbcd09170944"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:12:20.146270 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.146247 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-console-config" (OuterVolumeSpecName: "console-config") pod "d624f9ab-15cd-408f-9a39-dbcd09170944" (UID: "d624f9ab-15cd-408f-9a39-dbcd09170944"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:12:20.146550 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.146266 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d624f9ab-15cd-408f-9a39-dbcd09170944" (UID: "d624f9ab-15cd-408f-9a39-dbcd09170944"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:12:20.146550 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.146275 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d624f9ab-15cd-408f-9a39-dbcd09170944" (UID: "d624f9ab-15cd-408f-9a39-dbcd09170944"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:12:20.147705 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.147677 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d624f9ab-15cd-408f-9a39-dbcd09170944-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d624f9ab-15cd-408f-9a39-dbcd09170944" (UID: "d624f9ab-15cd-408f-9a39-dbcd09170944"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:12:20.147967 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.147941 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d624f9ab-15cd-408f-9a39-dbcd09170944-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d624f9ab-15cd-408f-9a39-dbcd09170944" (UID: "d624f9ab-15cd-408f-9a39-dbcd09170944"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:12:20.148048 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.147962 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d624f9ab-15cd-408f-9a39-dbcd09170944-kube-api-access-glhh2" (OuterVolumeSpecName: "kube-api-access-glhh2") pod "d624f9ab-15cd-408f-9a39-dbcd09170944" (UID: "d624f9ab-15cd-408f-9a39-dbcd09170944"). InnerVolumeSpecName "kube-api-access-glhh2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:12:20.246552 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.246516 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-service-ca\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:12:20.246552 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.246550 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d624f9ab-15cd-408f-9a39-dbcd09170944-console-serving-cert\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:12:20.246552 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.246560 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-trusted-ca-bundle\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:12:20.246717 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.246569 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-oauth-serving-cert\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:12:20.246717 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.246577 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-glhh2\" (UniqueName: 
\"kubernetes.io/projected/d624f9ab-15cd-408f-9a39-dbcd09170944-kube-api-access-glhh2\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:12:20.246717 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.246586 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d624f9ab-15cd-408f-9a39-dbcd09170944-console-oauth-config\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:12:20.246717 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.246596 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d624f9ab-15cd-408f-9a39-dbcd09170944-console-config\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:12:20.844933 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.844908 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-559bf5db75-cbrwz_d624f9ab-15cd-408f-9a39-dbcd09170944/console/0.log" Apr 17 17:12:20.845101 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.844947 2572 generic.go:358] "Generic (PLEG): container finished" podID="d624f9ab-15cd-408f-9a39-dbcd09170944" containerID="946947714e4b50d81c227fb2c877722619823b58e62e291c53e18133acf2ad26" exitCode=2 Apr 17 17:12:20.845101 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.844981 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-559bf5db75-cbrwz" event={"ID":"d624f9ab-15cd-408f-9a39-dbcd09170944","Type":"ContainerDied","Data":"946947714e4b50d81c227fb2c877722619823b58e62e291c53e18133acf2ad26"} Apr 17 17:12:20.845101 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.845014 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-559bf5db75-cbrwz" Apr 17 17:12:20.845101 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.845026 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-559bf5db75-cbrwz" event={"ID":"d624f9ab-15cd-408f-9a39-dbcd09170944","Type":"ContainerDied","Data":"c0c89afb97e4748e794ef6f525aa5f7ae744d5adcfc5371f54cad6baa4174735"} Apr 17 17:12:20.845101 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.845039 2572 scope.go:117] "RemoveContainer" containerID="946947714e4b50d81c227fb2c877722619823b58e62e291c53e18133acf2ad26" Apr 17 17:12:20.853154 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.853132 2572 scope.go:117] "RemoveContainer" containerID="946947714e4b50d81c227fb2c877722619823b58e62e291c53e18133acf2ad26" Apr 17 17:12:20.853434 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:12:20.853415 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"946947714e4b50d81c227fb2c877722619823b58e62e291c53e18133acf2ad26\": container with ID starting with 946947714e4b50d81c227fb2c877722619823b58e62e291c53e18133acf2ad26 not found: ID does not exist" containerID="946947714e4b50d81c227fb2c877722619823b58e62e291c53e18133acf2ad26" Apr 17 17:12:20.853506 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.853442 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946947714e4b50d81c227fb2c877722619823b58e62e291c53e18133acf2ad26"} err="failed to get container status \"946947714e4b50d81c227fb2c877722619823b58e62e291c53e18133acf2ad26\": rpc error: code = NotFound desc = could not find container \"946947714e4b50d81c227fb2c877722619823b58e62e291c53e18133acf2ad26\": container with ID starting with 946947714e4b50d81c227fb2c877722619823b58e62e291c53e18133acf2ad26 not found: ID does not exist" Apr 17 17:12:20.866068 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.866042 2572 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-559bf5db75-cbrwz"] Apr 17 17:12:20.869481 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.869461 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-559bf5db75-cbrwz"] Apr 17 17:12:20.949053 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:20.949029 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d624f9ab-15cd-408f-9a39-dbcd09170944" path="/var/lib/kubelet/pods/d624f9ab-15cd-408f-9a39-dbcd09170944/volumes" Apr 17 17:12:52.824444 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:52.824414 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:12:52.827276 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:52.827221 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:12:52.831969 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:52.831941 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 17:12:53.268350 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.268320 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6644598d65-p9rsv"] Apr 17 17:12:53.269465 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.268587 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d624f9ab-15cd-408f-9a39-dbcd09170944" containerName="console" Apr 17 17:12:53.269465 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.268598 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d624f9ab-15cd-408f-9a39-dbcd09170944" containerName="console" Apr 17 17:12:53.269465 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.268651 2572 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="d624f9ab-15cd-408f-9a39-dbcd09170944" containerName="console" Apr 17 17:12:53.270569 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.270545 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.280037 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.280015 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6644598d65-p9rsv"] Apr 17 17:12:53.367424 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.367399 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-serving-cert\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.367529 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.367429 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-trusted-ca-bundle\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.367529 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.367449 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf5lb\" (UniqueName: \"kubernetes.io/projected/8a464cdd-539a-4eb5-9ff8-9febb28a748e-kube-api-access-mf5lb\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.367529 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.367476 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-config\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.367635 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.367547 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-service-ca\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.367635 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.367609 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-oauth-config\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.367697 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.367636 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-oauth-serving-cert\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.468258 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.468228 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-oauth-serving-cert\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.468381 ip-10-0-138-224 kubenswrapper[2572]: I0417 
17:12:53.468274 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-serving-cert\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.468381 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.468297 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-trusted-ca-bundle\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.468381 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.468322 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf5lb\" (UniqueName: \"kubernetes.io/projected/8a464cdd-539a-4eb5-9ff8-9febb28a748e-kube-api-access-mf5lb\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.468381 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.468366 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-config\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.468586 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.468400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-service-ca\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 
17 17:12:53.468586 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.468459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-oauth-config\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.468964 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.468934 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-oauth-serving-cert\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.469123 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.469102 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-config\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.469197 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.469102 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-service-ca\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.469565 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.469546 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-trusted-ca-bundle\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " 
pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.470686 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.470657 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-oauth-config\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.470802 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.470784 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-serving-cert\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.476031 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.476011 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf5lb\" (UniqueName: \"kubernetes.io/projected/8a464cdd-539a-4eb5-9ff8-9febb28a748e-kube-api-access-mf5lb\") pod \"console-6644598d65-p9rsv\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.580138 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.580041 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:12:53.695781 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.695751 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6644598d65-p9rsv"] Apr 17 17:12:53.698639 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:12:53.698613 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a464cdd_539a_4eb5_9ff8_9febb28a748e.slice/crio-b44ec9ff2ca9e5a04f9a2d87ef4d49448bbdd2e0f50981041fa2428a7d45bd7b WatchSource:0}: Error finding container b44ec9ff2ca9e5a04f9a2d87ef4d49448bbdd2e0f50981041fa2428a7d45bd7b: Status 404 returned error can't find the container with id b44ec9ff2ca9e5a04f9a2d87ef4d49448bbdd2e0f50981041fa2428a7d45bd7b Apr 17 17:12:53.937519 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.937490 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6644598d65-p9rsv" event={"ID":"8a464cdd-539a-4eb5-9ff8-9febb28a748e","Type":"ContainerStarted","Data":"3b416bf1ee46a8c3d7fe65efb62a0e90d3f8d6f7817e112f3a0166f75497fcb5"} Apr 17 17:12:53.937912 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.937526 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6644598d65-p9rsv" event={"ID":"8a464cdd-539a-4eb5-9ff8-9febb28a748e","Type":"ContainerStarted","Data":"b44ec9ff2ca9e5a04f9a2d87ef4d49448bbdd2e0f50981041fa2428a7d45bd7b"} Apr 17 17:12:53.955000 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:12:53.954949 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6644598d65-p9rsv" podStartSLOduration=0.954932858 podStartE2EDuration="954.932858ms" podCreationTimestamp="2026-04-17 17:12:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:12:53.954335502 +0000 UTC 
m=+301.622019155" watchObservedRunningTime="2026-04-17 17:12:53.954932858 +0000 UTC m=+301.622616514" Apr 17 17:13:03.580330 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:03.580293 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:13:03.580330 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:03.580334 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:13:03.584903 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:03.584883 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:13:03.969330 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:03.969301 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:13:04.015543 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:04.015504 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5786dc8478-6nbr6"] Apr 17 17:13:29.035554 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.035521 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5786dc8478-6nbr6" podUID="33f66386-d1d3-4169-b4cc-f50d2ada18c7" containerName="console" containerID="cri-o://ad27326ed82faec7601f29d5ba678f7ab32bc3e64f281229971a656d0a400160" gracePeriod=15 Apr 17 17:13:29.271177 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.271153 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5786dc8478-6nbr6_33f66386-d1d3-4169-b4cc-f50d2ada18c7/console/0.log" Apr 17 17:13:29.271310 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.271239 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5786dc8478-6nbr6" Apr 17 17:13:29.412584 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.412519 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-config\") pod \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " Apr 17 17:13:29.412584 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.412558 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-serving-cert\") pod \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " Apr 17 17:13:29.412740 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.412585 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k56sw\" (UniqueName: \"kubernetes.io/projected/33f66386-d1d3-4169-b4cc-f50d2ada18c7-kube-api-access-k56sw\") pod \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " Apr 17 17:13:29.412740 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.412614 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-oauth-serving-cert\") pod \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " Apr 17 17:13:29.412740 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.412642 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-oauth-config\") pod \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " Apr 17 
17:13:29.412740 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.412683 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-trusted-ca-bundle\") pod \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " Apr 17 17:13:29.412740 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.412708 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-service-ca\") pod \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\" (UID: \"33f66386-d1d3-4169-b4cc-f50d2ada18c7\") " Apr 17 17:13:29.412977 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.412900 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-config" (OuterVolumeSpecName: "console-config") pod "33f66386-d1d3-4169-b4cc-f50d2ada18c7" (UID: "33f66386-d1d3-4169-b4cc-f50d2ada18c7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:13:29.413104 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.413068 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "33f66386-d1d3-4169-b4cc-f50d2ada18c7" (UID: "33f66386-d1d3-4169-b4cc-f50d2ada18c7"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:13:29.413193 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.413146 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "33f66386-d1d3-4169-b4cc-f50d2ada18c7" (UID: "33f66386-d1d3-4169-b4cc-f50d2ada18c7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:13:29.413193 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.413166 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-service-ca" (OuterVolumeSpecName: "service-ca") pod "33f66386-d1d3-4169-b4cc-f50d2ada18c7" (UID: "33f66386-d1d3-4169-b4cc-f50d2ada18c7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:13:29.414713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.414687 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "33f66386-d1d3-4169-b4cc-f50d2ada18c7" (UID: "33f66386-d1d3-4169-b4cc-f50d2ada18c7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:13:29.414791 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.414730 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "33f66386-d1d3-4169-b4cc-f50d2ada18c7" (UID: "33f66386-d1d3-4169-b4cc-f50d2ada18c7"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:13:29.414791 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.414772 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f66386-d1d3-4169-b4cc-f50d2ada18c7-kube-api-access-k56sw" (OuterVolumeSpecName: "kube-api-access-k56sw") pod "33f66386-d1d3-4169-b4cc-f50d2ada18c7" (UID: "33f66386-d1d3-4169-b4cc-f50d2ada18c7"). InnerVolumeSpecName "kube-api-access-k56sw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:13:29.513414 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.513391 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-trusted-ca-bundle\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:13:29.513414 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.513414 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-service-ca\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:13:29.513534 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.513427 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-config\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:13:29.513534 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.513435 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-serving-cert\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:13:29.513534 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.513444 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k56sw\" (UniqueName: 
\"kubernetes.io/projected/33f66386-d1d3-4169-b4cc-f50d2ada18c7-kube-api-access-k56sw\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\""
Apr 17 17:13:29.513534 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.513453 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f66386-d1d3-4169-b4cc-f50d2ada18c7-oauth-serving-cert\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\""
Apr 17 17:13:29.513534 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:29.513461 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f66386-d1d3-4169-b4cc-f50d2ada18c7-console-oauth-config\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\""
Apr 17 17:13:30.035933 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:30.035906 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5786dc8478-6nbr6_33f66386-d1d3-4169-b4cc-f50d2ada18c7/console/0.log"
Apr 17 17:13:30.036303 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:30.035949 2572 generic.go:358] "Generic (PLEG): container finished" podID="33f66386-d1d3-4169-b4cc-f50d2ada18c7" containerID="ad27326ed82faec7601f29d5ba678f7ab32bc3e64f281229971a656d0a400160" exitCode=2
Apr 17 17:13:30.036303 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:30.036037 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5786dc8478-6nbr6" event={"ID":"33f66386-d1d3-4169-b4cc-f50d2ada18c7","Type":"ContainerDied","Data":"ad27326ed82faec7601f29d5ba678f7ab32bc3e64f281229971a656d0a400160"}
Apr 17 17:13:30.036303 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:30.036065 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5786dc8478-6nbr6" event={"ID":"33f66386-d1d3-4169-b4cc-f50d2ada18c7","Type":"ContainerDied","Data":"4d2289d936064387c5bd9be590df681d6aa83e7b794dd15ee612d2bfe3927129"}
Apr 17 17:13:30.036303 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:30.036082 2572 scope.go:117] "RemoveContainer" containerID="ad27326ed82faec7601f29d5ba678f7ab32bc3e64f281229971a656d0a400160"
Apr 17 17:13:30.036303 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:30.036044 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5786dc8478-6nbr6"
Apr 17 17:13:30.045005 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:30.044983 2572 scope.go:117] "RemoveContainer" containerID="ad27326ed82faec7601f29d5ba678f7ab32bc3e64f281229971a656d0a400160"
Apr 17 17:13:30.045311 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:13:30.045290 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad27326ed82faec7601f29d5ba678f7ab32bc3e64f281229971a656d0a400160\": container with ID starting with ad27326ed82faec7601f29d5ba678f7ab32bc3e64f281229971a656d0a400160 not found: ID does not exist" containerID="ad27326ed82faec7601f29d5ba678f7ab32bc3e64f281229971a656d0a400160"
Apr 17 17:13:30.045369 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:30.045320 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad27326ed82faec7601f29d5ba678f7ab32bc3e64f281229971a656d0a400160"} err="failed to get container status \"ad27326ed82faec7601f29d5ba678f7ab32bc3e64f281229971a656d0a400160\": rpc error: code = NotFound desc = could not find container \"ad27326ed82faec7601f29d5ba678f7ab32bc3e64f281229971a656d0a400160\": container with ID starting with ad27326ed82faec7601f29d5ba678f7ab32bc3e64f281229971a656d0a400160 not found: ID does not exist"
Apr 17 17:13:30.057062 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:30.057036 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5786dc8478-6nbr6"]
Apr 17 17:13:30.061133 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:30.061110 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5786dc8478-6nbr6"]
Apr 17 17:13:30.949303 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:30.949273 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f66386-d1d3-4169-b4cc-f50d2ada18c7" path="/var/lib/kubelet/pods/33f66386-d1d3-4169-b4cc-f50d2ada18c7/volumes"
Apr 17 17:13:32.501472 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.501435 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-zw6zj"]
Apr 17 17:13:32.501839 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.501764 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33f66386-d1d3-4169-b4cc-f50d2ada18c7" containerName="console"
Apr 17 17:13:32.501839 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.501778 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f66386-d1d3-4169-b4cc-f50d2ada18c7" containerName="console"
Apr 17 17:13:32.501839 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.501834 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="33f66386-d1d3-4169-b4cc-f50d2ada18c7" containerName="console"
Apr 17 17:13:32.504680 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.504660 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zw6zj"
Apr 17 17:13:32.507586 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.507566 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 17:13:32.509630 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.509607 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zw6zj"]
Apr 17 17:13:32.635154 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.635130 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f15ce690-dc22-41c5-a9f0-106b87ea9815-original-pull-secret\") pod \"global-pull-secret-syncer-zw6zj\" (UID: \"f15ce690-dc22-41c5-a9f0-106b87ea9815\") " pod="kube-system/global-pull-secret-syncer-zw6zj"
Apr 17 17:13:32.635282 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.635166 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f15ce690-dc22-41c5-a9f0-106b87ea9815-dbus\") pod \"global-pull-secret-syncer-zw6zj\" (UID: \"f15ce690-dc22-41c5-a9f0-106b87ea9815\") " pod="kube-system/global-pull-secret-syncer-zw6zj"
Apr 17 17:13:32.635282 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.635240 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f15ce690-dc22-41c5-a9f0-106b87ea9815-kubelet-config\") pod \"global-pull-secret-syncer-zw6zj\" (UID: \"f15ce690-dc22-41c5-a9f0-106b87ea9815\") " pod="kube-system/global-pull-secret-syncer-zw6zj"
Apr 17 17:13:32.735772 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.735741 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f15ce690-dc22-41c5-a9f0-106b87ea9815-dbus\") pod \"global-pull-secret-syncer-zw6zj\" (UID: \"f15ce690-dc22-41c5-a9f0-106b87ea9815\") " pod="kube-system/global-pull-secret-syncer-zw6zj"
Apr 17 17:13:32.735857 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.735787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f15ce690-dc22-41c5-a9f0-106b87ea9815-kubelet-config\") pod \"global-pull-secret-syncer-zw6zj\" (UID: \"f15ce690-dc22-41c5-a9f0-106b87ea9815\") " pod="kube-system/global-pull-secret-syncer-zw6zj"
Apr 17 17:13:32.735857 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.735850 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f15ce690-dc22-41c5-a9f0-106b87ea9815-original-pull-secret\") pod \"global-pull-secret-syncer-zw6zj\" (UID: \"f15ce690-dc22-41c5-a9f0-106b87ea9815\") " pod="kube-system/global-pull-secret-syncer-zw6zj"
Apr 17 17:13:32.735954 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.735934 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f15ce690-dc22-41c5-a9f0-106b87ea9815-kubelet-config\") pod \"global-pull-secret-syncer-zw6zj\" (UID: \"f15ce690-dc22-41c5-a9f0-106b87ea9815\") " pod="kube-system/global-pull-secret-syncer-zw6zj"
Apr 17 17:13:32.735954 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.735946 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f15ce690-dc22-41c5-a9f0-106b87ea9815-dbus\") pod \"global-pull-secret-syncer-zw6zj\" (UID: \"f15ce690-dc22-41c5-a9f0-106b87ea9815\") " pod="kube-system/global-pull-secret-syncer-zw6zj"
Apr 17 17:13:32.738107 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.738080 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f15ce690-dc22-41c5-a9f0-106b87ea9815-original-pull-secret\") pod \"global-pull-secret-syncer-zw6zj\" (UID: \"f15ce690-dc22-41c5-a9f0-106b87ea9815\") " pod="kube-system/global-pull-secret-syncer-zw6zj"
Apr 17 17:13:32.814114 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.814065 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zw6zj"
Apr 17 17:13:32.929603 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.929566 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zw6zj"]
Apr 17 17:13:32.932714 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:13:32.932685 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf15ce690_dc22_41c5_a9f0_106b87ea9815.slice/crio-910f119bb03658ae7dec0df0ee47d3aecd95af6d4852cf97b820c503815ee857 WatchSource:0}: Error finding container 910f119bb03658ae7dec0df0ee47d3aecd95af6d4852cf97b820c503815ee857: Status 404 returned error can't find the container with id 910f119bb03658ae7dec0df0ee47d3aecd95af6d4852cf97b820c503815ee857
Apr 17 17:13:32.934082 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:32.934063 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:13:33.047167 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:33.047132 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zw6zj" event={"ID":"f15ce690-dc22-41c5-a9f0-106b87ea9815","Type":"ContainerStarted","Data":"910f119bb03658ae7dec0df0ee47d3aecd95af6d4852cf97b820c503815ee857"}
Apr 17 17:13:38.061749 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:38.061666 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zw6zj" event={"ID":"f15ce690-dc22-41c5-a9f0-106b87ea9815","Type":"ContainerStarted","Data":"580d855499c5a965b6aeb4d655de25d6fb3ce7b0d519df8e5f27e76c6a0413e3"}
Apr 17 17:13:38.077252 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:38.077186 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-zw6zj" podStartSLOduration=1.242989314 podStartE2EDuration="6.077172586s" podCreationTimestamp="2026-04-17 17:13:32 +0000 UTC" firstStartedPulling="2026-04-17 17:13:32.934223334 +0000 UTC m=+340.601906981" lastFinishedPulling="2026-04-17 17:13:37.768406616 +0000 UTC m=+345.436090253" observedRunningTime="2026-04-17 17:13:38.076138077 +0000 UTC m=+345.743821730" watchObservedRunningTime="2026-04-17 17:13:38.077172586 +0000 UTC m=+345.744856279"
Apr 17 17:13:44.813911 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:44.813869 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"]
Apr 17 17:13:44.816161 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:44.816141 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"
Apr 17 17:13:44.818534 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:44.818503 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 17:13:44.818534 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:44.818522 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 17:13:44.819511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:44.819489 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2vk2g\""
Apr 17 17:13:44.825403 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:44.825381 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"]
Apr 17 17:13:44.921347 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:44.921322 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg\" (UID: \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"
Apr 17 17:13:44.921462 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:44.921358 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg\" (UID: \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"
Apr 17 17:13:44.921462 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:44.921381 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24p7s\" (UniqueName: \"kubernetes.io/projected/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-kube-api-access-24p7s\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg\" (UID: \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"
Apr 17 17:13:45.022657 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:45.022634 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg\" (UID: \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"
Apr 17 17:13:45.022761 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:45.022669 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg\" (UID: \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"
Apr 17 17:13:45.022761 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:45.022690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24p7s\" (UniqueName: \"kubernetes.io/projected/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-kube-api-access-24p7s\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg\" (UID: \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"
Apr 17 17:13:45.023099 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:45.023074 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg\" (UID: \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"
Apr 17 17:13:45.023147 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:45.023084 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg\" (UID: \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"
Apr 17 17:13:45.030970 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:45.030947 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24p7s\" (UniqueName: \"kubernetes.io/projected/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-kube-api-access-24p7s\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg\" (UID: \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"
Apr 17 17:13:45.126649 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:45.126592 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"
Apr 17 17:13:45.243386 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:45.243364 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"]
Apr 17 17:13:45.245971 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:13:45.245942 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc31f015f_5a1a_4d16_a5b6_01a5f6db86f3.slice/crio-e844855bc6bea7db7448b8b628581e33cd3322e5f4b64990cec6cb7f7abc6400 WatchSource:0}: Error finding container e844855bc6bea7db7448b8b628581e33cd3322e5f4b64990cec6cb7f7abc6400: Status 404 returned error can't find the container with id e844855bc6bea7db7448b8b628581e33cd3322e5f4b64990cec6cb7f7abc6400
Apr 17 17:13:46.084343 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:46.084308 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg" event={"ID":"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3","Type":"ContainerStarted","Data":"e844855bc6bea7db7448b8b628581e33cd3322e5f4b64990cec6cb7f7abc6400"}
Apr 17 17:13:51.101615 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:51.101579 2572 generic.go:358] "Generic (PLEG): container finished" podID="c31f015f-5a1a-4d16-a5b6-01a5f6db86f3" containerID="cb67c6c2b4cca5e092e39c5c037c00b98f4b03d4da9b9ea8a566c03830c8f21e" exitCode=0
Apr 17 17:13:51.101995 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:51.101668 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg" event={"ID":"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3","Type":"ContainerDied","Data":"cb67c6c2b4cca5e092e39c5c037c00b98f4b03d4da9b9ea8a566c03830c8f21e"}
Apr 17 17:13:53.109094 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:53.109028 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg" event={"ID":"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3","Type":"ContainerStarted","Data":"ff5d88ce68724e400bf1aa4d98eb8d1de8dbb351afa1ee874a3ae0fc90d4c24d"}
Apr 17 17:13:54.118222 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:54.118170 2572 generic.go:358] "Generic (PLEG): container finished" podID="c31f015f-5a1a-4d16-a5b6-01a5f6db86f3" containerID="ff5d88ce68724e400bf1aa4d98eb8d1de8dbb351afa1ee874a3ae0fc90d4c24d" exitCode=0
Apr 17 17:13:54.118647 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:13:54.118261 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg" event={"ID":"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3","Type":"ContainerDied","Data":"ff5d88ce68724e400bf1aa4d98eb8d1de8dbb351afa1ee874a3ae0fc90d4c24d"}
Apr 17 17:14:01.138872 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:01.138791 2572 generic.go:358] "Generic (PLEG): container finished" podID="c31f015f-5a1a-4d16-a5b6-01a5f6db86f3" containerID="5d4f2504e74a5be92dff6589e331c9e911a234ddf44f729cd904eae7d5033775" exitCode=0
Apr 17 17:14:01.138872 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:01.138863 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg" event={"ID":"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3","Type":"ContainerDied","Data":"5d4f2504e74a5be92dff6589e331c9e911a234ddf44f729cd904eae7d5033775"}
Apr 17 17:14:02.258563 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:02.258540 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"
Apr 17 17:14:02.348679 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:02.348651 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-bundle\") pod \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\" (UID: \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\") "
Apr 17 17:14:02.348799 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:02.348704 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-util\") pod \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\" (UID: \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\") "
Apr 17 17:14:02.348799 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:02.348726 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24p7s\" (UniqueName: \"kubernetes.io/projected/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-kube-api-access-24p7s\") pod \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\" (UID: \"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3\") "
Apr 17 17:14:02.349178 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:02.349144 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-bundle" (OuterVolumeSpecName: "bundle") pod "c31f015f-5a1a-4d16-a5b6-01a5f6db86f3" (UID: "c31f015f-5a1a-4d16-a5b6-01a5f6db86f3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:14:02.350809 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:02.350783 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-kube-api-access-24p7s" (OuterVolumeSpecName: "kube-api-access-24p7s") pod "c31f015f-5a1a-4d16-a5b6-01a5f6db86f3" (UID: "c31f015f-5a1a-4d16-a5b6-01a5f6db86f3"). InnerVolumeSpecName "kube-api-access-24p7s". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:14:02.352917 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:02.352895 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-util" (OuterVolumeSpecName: "util") pod "c31f015f-5a1a-4d16-a5b6-01a5f6db86f3" (UID: "c31f015f-5a1a-4d16-a5b6-01a5f6db86f3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:14:02.449903 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:02.449881 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-util\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\""
Apr 17 17:14:02.449903 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:02.449903 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-24p7s\" (UniqueName: \"kubernetes.io/projected/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-kube-api-access-24p7s\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\""
Apr 17 17:14:02.450013 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:02.449913 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c31f015f-5a1a-4d16-a5b6-01a5f6db86f3-bundle\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\""
Apr 17 17:14:03.145952 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:03.145913 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg" event={"ID":"c31f015f-5a1a-4d16-a5b6-01a5f6db86f3","Type":"ContainerDied","Data":"e844855bc6bea7db7448b8b628581e33cd3322e5f4b64990cec6cb7f7abc6400"}
Apr 17 17:14:03.145952 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:03.145945 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e844855bc6bea7db7448b8b628581e33cd3322e5f4b64990cec6cb7f7abc6400"
Apr 17 17:14:03.146134 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:03.145984 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rfqlg"
Apr 17 17:14:07.233323 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.233293 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-7k6hz"]
Apr 17 17:14:07.233678 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.233568 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c31f015f-5a1a-4d16-a5b6-01a5f6db86f3" containerName="extract"
Apr 17 17:14:07.233678 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.233579 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31f015f-5a1a-4d16-a5b6-01a5f6db86f3" containerName="extract"
Apr 17 17:14:07.233678 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.233593 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c31f015f-5a1a-4d16-a5b6-01a5f6db86f3" containerName="util"
Apr 17 17:14:07.233678 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.233598 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31f015f-5a1a-4d16-a5b6-01a5f6db86f3" containerName="util"
Apr 17 17:14:07.233678 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.233605 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c31f015f-5a1a-4d16-a5b6-01a5f6db86f3" containerName="pull"
Apr 17 17:14:07.233678 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.233610 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31f015f-5a1a-4d16-a5b6-01a5f6db86f3" containerName="pull"
Apr 17 17:14:07.233678 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.233661 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c31f015f-5a1a-4d16-a5b6-01a5f6db86f3" containerName="extract"
Apr 17 17:14:07.254454 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.254430 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-7k6hz"]
Apr 17 17:14:07.254592 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.254547 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-7k6hz"
Apr 17 17:14:07.258129 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.258102 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:14:07.258278 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.258111 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-thdmx\""
Apr 17 17:14:07.258278 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.258152 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 17 17:14:07.285983 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.285955 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlgc9\" (UniqueName: \"kubernetes.io/projected/02e08822-489b-44e7-a0a3-16fe907b16f2-kube-api-access-vlgc9\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-7k6hz\" (UID: \"02e08822-489b-44e7-a0a3-16fe907b16f2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-7k6hz"
Apr 17 17:14:07.286085 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.285990 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/02e08822-489b-44e7-a0a3-16fe907b16f2-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-7k6hz\" (UID: \"02e08822-489b-44e7-a0a3-16fe907b16f2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-7k6hz"
Apr 17 17:14:07.386360 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.386335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlgc9\" (UniqueName: \"kubernetes.io/projected/02e08822-489b-44e7-a0a3-16fe907b16f2-kube-api-access-vlgc9\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-7k6hz\" (UID: \"02e08822-489b-44e7-a0a3-16fe907b16f2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-7k6hz"
Apr 17 17:14:07.386454 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.386367 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/02e08822-489b-44e7-a0a3-16fe907b16f2-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-7k6hz\" (UID: \"02e08822-489b-44e7-a0a3-16fe907b16f2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-7k6hz"
Apr 17 17:14:07.386748 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.386731 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/02e08822-489b-44e7-a0a3-16fe907b16f2-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-7k6hz\" (UID: \"02e08822-489b-44e7-a0a3-16fe907b16f2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-7k6hz"
Apr 17 17:14:07.399564 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.399538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlgc9\" (UniqueName: \"kubernetes.io/projected/02e08822-489b-44e7-a0a3-16fe907b16f2-kube-api-access-vlgc9\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-7k6hz\" (UID: \"02e08822-489b-44e7-a0a3-16fe907b16f2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-7k6hz"
Apr 17 17:14:07.563732 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.563670 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-7k6hz"
Apr 17 17:14:07.692292 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:07.692176 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-7k6hz"]
Apr 17 17:14:07.695174 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:14:07.695147 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02e08822_489b_44e7_a0a3_16fe907b16f2.slice/crio-b5c907d404321473b7e31968b9579c814133919f3529d90e4fdf52107706a9ce WatchSource:0}: Error finding container b5c907d404321473b7e31968b9579c814133919f3529d90e4fdf52107706a9ce: Status 404 returned error can't find the container with id b5c907d404321473b7e31968b9579c814133919f3529d90e4fdf52107706a9ce
Apr 17 17:14:08.161562 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:08.161531 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-7k6hz" event={"ID":"02e08822-489b-44e7-a0a3-16fe907b16f2","Type":"ContainerStarted","Data":"b5c907d404321473b7e31968b9579c814133919f3529d90e4fdf52107706a9ce"}
Apr 17 17:14:12.177859 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:12.177065 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-7k6hz" event={"ID":"02e08822-489b-44e7-a0a3-16fe907b16f2","Type":"ContainerStarted","Data":"5fb8db236ebad4ffebeb038b4be950938188532d65e80b1539b71d34c846b7a0"}
Apr 17 17:14:12.201202 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:12.201138 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-7k6hz" podStartSLOduration=1.7616035829999999 podStartE2EDuration="5.201120574s" podCreationTimestamp="2026-04-17 17:14:07 +0000 UTC" firstStartedPulling="2026-04-17 17:14:07.697656699 +0000 UTC m=+375.365340331" lastFinishedPulling="2026-04-17 17:14:11.137173677 +0000 UTC m=+378.804857322" observedRunningTime="2026-04-17 17:14:12.199357636 +0000 UTC m=+379.867041287" watchObservedRunningTime="2026-04-17 17:14:12.201120574 +0000 UTC m=+379.868804229"
Apr 17 17:14:14.012653 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.012608 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6"]
Apr 17 17:14:14.038641 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.038615 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6"]
Apr 17 17:14:14.038766 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.038723 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6"
Apr 17 17:14:14.041279 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.041259 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 17:14:14.042183 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.042167 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 17:14:14.042267 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.042248 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2vk2g\""
Apr 17 17:14:14.138622 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.138596 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbh28\" (UniqueName: \"kubernetes.io/projected/602390af-8394-46ed-bdea-617c5bdddaed-kube-api-access-tbh28\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6\" (UID: \"602390af-8394-46ed-bdea-617c5bdddaed\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6"
Apr 17 17:14:14.138733 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.138645 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/602390af-8394-46ed-bdea-617c5bdddaed-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6\" (UID: \"602390af-8394-46ed-bdea-617c5bdddaed\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6"
Apr 17 17:14:14.138733 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.138726 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/602390af-8394-46ed-bdea-617c5bdddaed-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6\" (UID: \"602390af-8394-46ed-bdea-617c5bdddaed\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6"
Apr 17 17:14:14.239578 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.239548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/602390af-8394-46ed-bdea-617c5bdddaed-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6\" (UID: \"602390af-8394-46ed-bdea-617c5bdddaed\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6"
Apr 17 17:14:14.239678 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.239597 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbh28\" (UniqueName: \"kubernetes.io/projected/602390af-8394-46ed-bdea-617c5bdddaed-kube-api-access-tbh28\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6\" (UID: \"602390af-8394-46ed-bdea-617c5bdddaed\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6"
Apr 17 17:14:14.239678 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.239632 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/602390af-8394-46ed-bdea-617c5bdddaed-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6\" (UID: \"602390af-8394-46ed-bdea-617c5bdddaed\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6"
Apr 17 17:14:14.239954 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.239935 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/602390af-8394-46ed-bdea-617c5bdddaed-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6\" (UID: \"602390af-8394-46ed-bdea-617c5bdddaed\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6"
Apr 17 17:14:14.240030 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.239999 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/602390af-8394-46ed-bdea-617c5bdddaed-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6\" (UID: \"602390af-8394-46ed-bdea-617c5bdddaed\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6"
Apr 17 17:14:14.248105 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.248077 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbh28\" (UniqueName: \"kubernetes.io/projected/602390af-8394-46ed-bdea-617c5bdddaed-kube-api-access-tbh28\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6\" (UID: \"602390af-8394-46ed-bdea-617c5bdddaed\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6"
Apr 17 17:14:14.347319 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.347263 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6" Apr 17 17:14:14.666308 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:14.666282 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6"] Apr 17 17:14:14.668790 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:14:14.668758 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod602390af_8394_46ed_bdea_617c5bdddaed.slice/crio-e0ce5e8b977e9f094f78c64c2d9b48b0aa6bd957ae19148068d45f417c4c9f06 WatchSource:0}: Error finding container e0ce5e8b977e9f094f78c64c2d9b48b0aa6bd957ae19148068d45f417c4c9f06: Status 404 returned error can't find the container with id e0ce5e8b977e9f094f78c64c2d9b48b0aa6bd957ae19148068d45f417c4c9f06 Apr 17 17:14:15.185935 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:15.185898 2572 generic.go:358] "Generic (PLEG): container finished" podID="602390af-8394-46ed-bdea-617c5bdddaed" containerID="55af73bd8ec6c13bb26913f9b987f7de0c5d691ecf801def6ea4ea87f356c1fe" exitCode=0 Apr 17 17:14:15.186335 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:15.185940 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6" event={"ID":"602390af-8394-46ed-bdea-617c5bdddaed","Type":"ContainerDied","Data":"55af73bd8ec6c13bb26913f9b987f7de0c5d691ecf801def6ea4ea87f356c1fe"} Apr 17 17:14:15.186335 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:15.185960 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6" event={"ID":"602390af-8394-46ed-bdea-617c5bdddaed","Type":"ContainerStarted","Data":"e0ce5e8b977e9f094f78c64c2d9b48b0aa6bd957ae19148068d45f417c4c9f06"} Apr 17 17:14:17.583180 ip-10-0-138-224 kubenswrapper[2572]: 
I0417 17:14:17.583114 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-clh22"] Apr 17 17:14:17.586294 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:17.586277 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-clh22" Apr 17 17:14:17.588701 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:17.588676 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 17:14:17.588820 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:17.588799 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-6bvjj\"" Apr 17 17:14:17.589687 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:17.589670 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 17:14:17.594228 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:17.594191 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-clh22"] Apr 17 17:14:17.667320 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:17.667292 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhbbl\" (UniqueName: \"kubernetes.io/projected/238dda93-a0a1-4a38-bb28-d6db31a71fb5-kube-api-access-rhbbl\") pod \"cert-manager-cainjector-8966b78d4-clh22\" (UID: \"238dda93-a0a1-4a38-bb28-d6db31a71fb5\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-clh22" Apr 17 17:14:17.667416 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:17.667338 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/238dda93-a0a1-4a38-bb28-d6db31a71fb5-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-clh22\" (UID: 
\"238dda93-a0a1-4a38-bb28-d6db31a71fb5\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-clh22" Apr 17 17:14:17.768322 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:17.768296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhbbl\" (UniqueName: \"kubernetes.io/projected/238dda93-a0a1-4a38-bb28-d6db31a71fb5-kube-api-access-rhbbl\") pod \"cert-manager-cainjector-8966b78d4-clh22\" (UID: \"238dda93-a0a1-4a38-bb28-d6db31a71fb5\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-clh22" Apr 17 17:14:17.768411 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:17.768332 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/238dda93-a0a1-4a38-bb28-d6db31a71fb5-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-clh22\" (UID: \"238dda93-a0a1-4a38-bb28-d6db31a71fb5\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-clh22" Apr 17 17:14:17.776374 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:17.776354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/238dda93-a0a1-4a38-bb28-d6db31a71fb5-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-clh22\" (UID: \"238dda93-a0a1-4a38-bb28-d6db31a71fb5\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-clh22" Apr 17 17:14:17.776637 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:17.776618 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhbbl\" (UniqueName: \"kubernetes.io/projected/238dda93-a0a1-4a38-bb28-d6db31a71fb5-kube-api-access-rhbbl\") pod \"cert-manager-cainjector-8966b78d4-clh22\" (UID: \"238dda93-a0a1-4a38-bb28-d6db31a71fb5\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-clh22" Apr 17 17:14:17.911051 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:17.911002 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-clh22" Apr 17 17:14:18.023238 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:18.023195 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-clh22"] Apr 17 17:14:18.025499 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:14:18.025475 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod238dda93_a0a1_4a38_bb28_d6db31a71fb5.slice/crio-ffc601da0da4ab73d1c01df95d211cd55cd0c5227c37d60cc41880e12f4d5b84 WatchSource:0}: Error finding container ffc601da0da4ab73d1c01df95d211cd55cd0c5227c37d60cc41880e12f4d5b84: Status 404 returned error can't find the container with id ffc601da0da4ab73d1c01df95d211cd55cd0c5227c37d60cc41880e12f4d5b84 Apr 17 17:14:18.197178 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:18.197151 2572 generic.go:358] "Generic (PLEG): container finished" podID="602390af-8394-46ed-bdea-617c5bdddaed" containerID="5fe22e77caba447561a75ced77b9d1df54b8f493a5d02b4412355905b8a3a26f" exitCode=0 Apr 17 17:14:18.197331 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:18.197246 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6" event={"ID":"602390af-8394-46ed-bdea-617c5bdddaed","Type":"ContainerDied","Data":"5fe22e77caba447561a75ced77b9d1df54b8f493a5d02b4412355905b8a3a26f"} Apr 17 17:14:18.198356 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:18.198333 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-clh22" event={"ID":"238dda93-a0a1-4a38-bb28-d6db31a71fb5","Type":"ContainerStarted","Data":"ffc601da0da4ab73d1c01df95d211cd55cd0c5227c37d60cc41880e12f4d5b84"} Apr 17 17:14:19.206080 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:19.206034 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="602390af-8394-46ed-bdea-617c5bdddaed" containerID="560da6e765c8e56a6130d9eeb3b462560a12f3cf7c3e5d99c0942e750a4cbed8" exitCode=0 Apr 17 17:14:19.206535 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:19.206101 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6" event={"ID":"602390af-8394-46ed-bdea-617c5bdddaed","Type":"ContainerDied","Data":"560da6e765c8e56a6130d9eeb3b462560a12f3cf7c3e5d99c0942e750a4cbed8"} Apr 17 17:14:20.769322 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:20.769300 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6" Apr 17 17:14:20.894783 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:20.894757 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbh28\" (UniqueName: \"kubernetes.io/projected/602390af-8394-46ed-bdea-617c5bdddaed-kube-api-access-tbh28\") pod \"602390af-8394-46ed-bdea-617c5bdddaed\" (UID: \"602390af-8394-46ed-bdea-617c5bdddaed\") " Apr 17 17:14:20.894892 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:20.894811 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/602390af-8394-46ed-bdea-617c5bdddaed-util\") pod \"602390af-8394-46ed-bdea-617c5bdddaed\" (UID: \"602390af-8394-46ed-bdea-617c5bdddaed\") " Apr 17 17:14:20.894892 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:20.894866 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/602390af-8394-46ed-bdea-617c5bdddaed-bundle\") pod \"602390af-8394-46ed-bdea-617c5bdddaed\" (UID: \"602390af-8394-46ed-bdea-617c5bdddaed\") " Apr 17 17:14:20.895315 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:20.895288 2572 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/602390af-8394-46ed-bdea-617c5bdddaed-bundle" (OuterVolumeSpecName: "bundle") pod "602390af-8394-46ed-bdea-617c5bdddaed" (UID: "602390af-8394-46ed-bdea-617c5bdddaed"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:14:20.896701 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:20.896679 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/602390af-8394-46ed-bdea-617c5bdddaed-kube-api-access-tbh28" (OuterVolumeSpecName: "kube-api-access-tbh28") pod "602390af-8394-46ed-bdea-617c5bdddaed" (UID: "602390af-8394-46ed-bdea-617c5bdddaed"). InnerVolumeSpecName "kube-api-access-tbh28". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:14:20.901127 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:20.901100 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/602390af-8394-46ed-bdea-617c5bdddaed-util" (OuterVolumeSpecName: "util") pod "602390af-8394-46ed-bdea-617c5bdddaed" (UID: "602390af-8394-46ed-bdea-617c5bdddaed"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:14:20.995969 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:20.995945 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tbh28\" (UniqueName: \"kubernetes.io/projected/602390af-8394-46ed-bdea-617c5bdddaed-kube-api-access-tbh28\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:14:20.995969 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:20.995969 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/602390af-8394-46ed-bdea-617c5bdddaed-util\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:14:20.996104 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:20.995979 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/602390af-8394-46ed-bdea-617c5bdddaed-bundle\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:14:21.214192 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:21.214157 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6" event={"ID":"602390af-8394-46ed-bdea-617c5bdddaed","Type":"ContainerDied","Data":"e0ce5e8b977e9f094f78c64c2d9b48b0aa6bd957ae19148068d45f417c4c9f06"} Apr 17 17:14:21.214192 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:21.214188 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0ce5e8b977e9f094f78c64c2d9b48b0aa6bd957ae19148068d45f417c4c9f06" Apr 17 17:14:21.214192 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:21.214187 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rxp6" Apr 17 17:14:21.215657 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:21.215616 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-clh22" event={"ID":"238dda93-a0a1-4a38-bb28-d6db31a71fb5","Type":"ContainerStarted","Data":"a58e29c4952e4de0906d3b28b3799af58e8b89f0026fe7e37438f5b2735b8116"} Apr 17 17:14:21.231340 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:21.231298 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-clh22" podStartSLOduration=1.437091224 podStartE2EDuration="4.231287021s" podCreationTimestamp="2026-04-17 17:14:17 +0000 UTC" firstStartedPulling="2026-04-17 17:14:18.027349109 +0000 UTC m=+385.695032745" lastFinishedPulling="2026-04-17 17:14:20.821544897 +0000 UTC m=+388.489228542" observedRunningTime="2026-04-17 17:14:21.229525481 +0000 UTC m=+388.897209117" watchObservedRunningTime="2026-04-17 17:14:21.231287021 +0000 UTC m=+388.898970675" Apr 17 17:14:32.878360 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:32.878322 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-7nfnk"] Apr 17 17:14:32.878811 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:32.878590 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="602390af-8394-46ed-bdea-617c5bdddaed" containerName="extract" Apr 17 17:14:32.878811 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:32.878601 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="602390af-8394-46ed-bdea-617c5bdddaed" containerName="extract" Apr 17 17:14:32.878811 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:32.878620 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="602390af-8394-46ed-bdea-617c5bdddaed" containerName="util" Apr 17 17:14:32.878811 
ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:32.878625 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="602390af-8394-46ed-bdea-617c5bdddaed" containerName="util" Apr 17 17:14:32.878811 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:32.878631 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="602390af-8394-46ed-bdea-617c5bdddaed" containerName="pull" Apr 17 17:14:32.878811 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:32.878637 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="602390af-8394-46ed-bdea-617c5bdddaed" containerName="pull" Apr 17 17:14:32.878811 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:32.878702 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="602390af-8394-46ed-bdea-617c5bdddaed" containerName="extract" Apr 17 17:14:32.881699 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:32.881680 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-7nfnk" Apr 17 17:14:32.884000 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:32.883984 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-h7v95\"" Apr 17 17:14:32.890266 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:32.890169 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-7nfnk"] Apr 17 17:14:32.981790 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:32.981757 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66wsc\" (UniqueName: \"kubernetes.io/projected/c3e49ee8-a3cd-438f-ad55-3c11c1bbd0e8-kube-api-access-66wsc\") pod \"cert-manager-759f64656b-7nfnk\" (UID: \"c3e49ee8-a3cd-438f-ad55-3c11c1bbd0e8\") " pod="cert-manager/cert-manager-759f64656b-7nfnk" Apr 17 17:14:32.981933 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:32.981797 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3e49ee8-a3cd-438f-ad55-3c11c1bbd0e8-bound-sa-token\") pod \"cert-manager-759f64656b-7nfnk\" (UID: \"c3e49ee8-a3cd-438f-ad55-3c11c1bbd0e8\") " pod="cert-manager/cert-manager-759f64656b-7nfnk" Apr 17 17:14:33.082596 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:33.082568 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66wsc\" (UniqueName: \"kubernetes.io/projected/c3e49ee8-a3cd-438f-ad55-3c11c1bbd0e8-kube-api-access-66wsc\") pod \"cert-manager-759f64656b-7nfnk\" (UID: \"c3e49ee8-a3cd-438f-ad55-3c11c1bbd0e8\") " pod="cert-manager/cert-manager-759f64656b-7nfnk" Apr 17 17:14:33.082704 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:33.082605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3e49ee8-a3cd-438f-ad55-3c11c1bbd0e8-bound-sa-token\") pod \"cert-manager-759f64656b-7nfnk\" (UID: \"c3e49ee8-a3cd-438f-ad55-3c11c1bbd0e8\") " pod="cert-manager/cert-manager-759f64656b-7nfnk" Apr 17 17:14:33.093133 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:33.093099 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3e49ee8-a3cd-438f-ad55-3c11c1bbd0e8-bound-sa-token\") pod \"cert-manager-759f64656b-7nfnk\" (UID: \"c3e49ee8-a3cd-438f-ad55-3c11c1bbd0e8\") " pod="cert-manager/cert-manager-759f64656b-7nfnk" Apr 17 17:14:33.093335 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:33.093315 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66wsc\" (UniqueName: \"kubernetes.io/projected/c3e49ee8-a3cd-438f-ad55-3c11c1bbd0e8-kube-api-access-66wsc\") pod \"cert-manager-759f64656b-7nfnk\" (UID: \"c3e49ee8-a3cd-438f-ad55-3c11c1bbd0e8\") " pod="cert-manager/cert-manager-759f64656b-7nfnk" Apr 17 
17:14:33.192940 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:33.192913 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-7nfnk" Apr 17 17:14:33.309290 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:33.309268 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-7nfnk"] Apr 17 17:14:33.312519 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:14:33.312488 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3e49ee8_a3cd_438f_ad55_3c11c1bbd0e8.slice/crio-766bec1cf6cdd0f6228e97e999573b8b9730200287cf33ff518da8ac0782e344 WatchSource:0}: Error finding container 766bec1cf6cdd0f6228e97e999573b8b9730200287cf33ff518da8ac0782e344: Status 404 returned error can't find the container with id 766bec1cf6cdd0f6228e97e999573b8b9730200287cf33ff518da8ac0782e344 Apr 17 17:14:34.260913 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:34.260866 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-7nfnk" event={"ID":"c3e49ee8-a3cd-438f-ad55-3c11c1bbd0e8","Type":"ContainerStarted","Data":"121326cc2120c9fae0e70c968929bf8268ac0855b15ce67fc946c829544867ee"} Apr 17 17:14:34.260913 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:34.260918 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-7nfnk" event={"ID":"c3e49ee8-a3cd-438f-ad55-3c11c1bbd0e8","Type":"ContainerStarted","Data":"766bec1cf6cdd0f6228e97e999573b8b9730200287cf33ff518da8ac0782e344"} Apr 17 17:14:34.276461 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:34.276409 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-7nfnk" podStartSLOduration=2.2763882779999998 podStartE2EDuration="2.276388278s" podCreationTimestamp="2026-04-17 17:14:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:14:34.275842807 +0000 UTC m=+401.943526455" watchObservedRunningTime="2026-04-17 17:14:34.276388278 +0000 UTC m=+401.944071933" Apr 17 17:14:53.386524 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.386495 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5"] Apr 17 17:14:53.393293 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.393276 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" Apr 17 17:14:53.396285 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.396258 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:14:53.396285 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.396279 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:14:53.396792 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.396771 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5"] Apr 17 17:14:53.397238 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.397201 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2vk2g\"" Apr 17 17:14:53.531482 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.531455 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f99dda6-1383-45df-94ee-a68fba203c6e-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5\" (UID: \"6f99dda6-1383-45df-94ee-a68fba203c6e\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" Apr 17 17:14:53.531576 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.531498 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mxg8\" (UniqueName: \"kubernetes.io/projected/6f99dda6-1383-45df-94ee-a68fba203c6e-kube-api-access-2mxg8\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5\" (UID: \"6f99dda6-1383-45df-94ee-a68fba203c6e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" Apr 17 17:14:53.531576 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.531573 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f99dda6-1383-45df-94ee-a68fba203c6e-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5\" (UID: \"6f99dda6-1383-45df-94ee-a68fba203c6e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" Apr 17 17:14:53.632571 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.632548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f99dda6-1383-45df-94ee-a68fba203c6e-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5\" (UID: \"6f99dda6-1383-45df-94ee-a68fba203c6e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" Apr 17 17:14:53.632668 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.632585 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mxg8\" (UniqueName: \"kubernetes.io/projected/6f99dda6-1383-45df-94ee-a68fba203c6e-kube-api-access-2mxg8\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5\" (UID: \"6f99dda6-1383-45df-94ee-a68fba203c6e\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" Apr 17 17:14:53.632668 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.632616 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f99dda6-1383-45df-94ee-a68fba203c6e-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5\" (UID: \"6f99dda6-1383-45df-94ee-a68fba203c6e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" Apr 17 17:14:53.632912 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.632892 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f99dda6-1383-45df-94ee-a68fba203c6e-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5\" (UID: \"6f99dda6-1383-45df-94ee-a68fba203c6e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" Apr 17 17:14:53.632971 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.632912 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f99dda6-1383-45df-94ee-a68fba203c6e-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5\" (UID: \"6f99dda6-1383-45df-94ee-a68fba203c6e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" Apr 17 17:14:53.643750 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.643696 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mxg8\" (UniqueName: \"kubernetes.io/projected/6f99dda6-1383-45df-94ee-a68fba203c6e-kube-api-access-2mxg8\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5\" (UID: \"6f99dda6-1383-45df-94ee-a68fba203c6e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" Apr 17 
17:14:53.704592 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.704570 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" Apr 17 17:14:53.826242 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:53.826201 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5"] Apr 17 17:14:53.828022 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:14:53.827992 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f99dda6_1383_45df_94ee_a68fba203c6e.slice/crio-863a06a22172fcfacbf8369723ab2c6afaff4e09d000393d7a794d46da726a61 WatchSource:0}: Error finding container 863a06a22172fcfacbf8369723ab2c6afaff4e09d000393d7a794d46da726a61: Status 404 returned error can't find the container with id 863a06a22172fcfacbf8369723ab2c6afaff4e09d000393d7a794d46da726a61 Apr 17 17:14:54.329815 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:54.329779 2572 generic.go:358] "Generic (PLEG): container finished" podID="6f99dda6-1383-45df-94ee-a68fba203c6e" containerID="21cdb688832a4033f9ea3d4437e1ea6453b3648c65b377bd5a241abb0dd5f4b6" exitCode=0 Apr 17 17:14:54.330039 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:54.329862 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" event={"ID":"6f99dda6-1383-45df-94ee-a68fba203c6e","Type":"ContainerDied","Data":"21cdb688832a4033f9ea3d4437e1ea6453b3648c65b377bd5a241abb0dd5f4b6"} Apr 17 17:14:54.330039 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:54.329901 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" 
event={"ID":"6f99dda6-1383-45df-94ee-a68fba203c6e","Type":"ContainerStarted","Data":"863a06a22172fcfacbf8369723ab2c6afaff4e09d000393d7a794d46da726a61"} Apr 17 17:14:55.335312 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:55.335244 2572 generic.go:358] "Generic (PLEG): container finished" podID="6f99dda6-1383-45df-94ee-a68fba203c6e" containerID="98a6b46cf958215d071e032ac0bf8d6e56bed5fc077b187fe76ee67ba3d5bbcb" exitCode=0 Apr 17 17:14:55.335627 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:55.335326 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" event={"ID":"6f99dda6-1383-45df-94ee-a68fba203c6e","Type":"ContainerDied","Data":"98a6b46cf958215d071e032ac0bf8d6e56bed5fc077b187fe76ee67ba3d5bbcb"} Apr 17 17:14:56.341127 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:56.341082 2572 generic.go:358] "Generic (PLEG): container finished" podID="6f99dda6-1383-45df-94ee-a68fba203c6e" containerID="f7614525d40a815defdb78176adb4e5273cb1da6eb2f80acf3f7bddf87b66269" exitCode=0 Apr 17 17:14:56.341532 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:56.341129 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" event={"ID":"6f99dda6-1383-45df-94ee-a68fba203c6e","Type":"ContainerDied","Data":"f7614525d40a815defdb78176adb4e5273cb1da6eb2f80acf3f7bddf87b66269"} Apr 17 17:14:57.466497 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:57.466473 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" Apr 17 17:14:57.560718 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:57.560691 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f99dda6-1383-45df-94ee-a68fba203c6e-util\") pod \"6f99dda6-1383-45df-94ee-a68fba203c6e\" (UID: \"6f99dda6-1383-45df-94ee-a68fba203c6e\") " Apr 17 17:14:57.560835 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:57.560734 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f99dda6-1383-45df-94ee-a68fba203c6e-bundle\") pod \"6f99dda6-1383-45df-94ee-a68fba203c6e\" (UID: \"6f99dda6-1383-45df-94ee-a68fba203c6e\") " Apr 17 17:14:57.560835 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:57.560790 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mxg8\" (UniqueName: \"kubernetes.io/projected/6f99dda6-1383-45df-94ee-a68fba203c6e-kube-api-access-2mxg8\") pod \"6f99dda6-1383-45df-94ee-a68fba203c6e\" (UID: \"6f99dda6-1383-45df-94ee-a68fba203c6e\") " Apr 17 17:14:57.561462 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:57.561434 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f99dda6-1383-45df-94ee-a68fba203c6e-bundle" (OuterVolumeSpecName: "bundle") pod "6f99dda6-1383-45df-94ee-a68fba203c6e" (UID: "6f99dda6-1383-45df-94ee-a68fba203c6e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:14:57.562822 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:57.562796 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f99dda6-1383-45df-94ee-a68fba203c6e-kube-api-access-2mxg8" (OuterVolumeSpecName: "kube-api-access-2mxg8") pod "6f99dda6-1383-45df-94ee-a68fba203c6e" (UID: "6f99dda6-1383-45df-94ee-a68fba203c6e"). InnerVolumeSpecName "kube-api-access-2mxg8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:14:57.566181 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:57.566163 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f99dda6-1383-45df-94ee-a68fba203c6e-util" (OuterVolumeSpecName: "util") pod "6f99dda6-1383-45df-94ee-a68fba203c6e" (UID: "6f99dda6-1383-45df-94ee-a68fba203c6e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:14:57.661576 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:57.661529 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f99dda6-1383-45df-94ee-a68fba203c6e-util\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:14:57.661576 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:57.661548 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f99dda6-1383-45df-94ee-a68fba203c6e-bundle\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:14:57.661576 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:57.661557 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2mxg8\" (UniqueName: \"kubernetes.io/projected/6f99dda6-1383-45df-94ee-a68fba203c6e-kube-api-access-2mxg8\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:14:58.349036 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:58.349004 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" event={"ID":"6f99dda6-1383-45df-94ee-a68fba203c6e","Type":"ContainerDied","Data":"863a06a22172fcfacbf8369723ab2c6afaff4e09d000393d7a794d46da726a61"} Apr 17 17:14:58.349036 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:58.349037 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="863a06a22172fcfacbf8369723ab2c6afaff4e09d000393d7a794d46da726a61" Apr 17 17:14:58.349298 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:14:58.349015 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5b2lm5" Apr 17 17:15:07.562034 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.561955 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq"] Apr 17 17:15:07.562415 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.562239 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f99dda6-1383-45df-94ee-a68fba203c6e" containerName="pull" Apr 17 17:15:07.562415 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.562250 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f99dda6-1383-45df-94ee-a68fba203c6e" containerName="pull" Apr 17 17:15:07.562415 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.562266 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f99dda6-1383-45df-94ee-a68fba203c6e" containerName="util" Apr 17 17:15:07.562415 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.562271 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f99dda6-1383-45df-94ee-a68fba203c6e" containerName="util" Apr 17 17:15:07.562415 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.562278 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6f99dda6-1383-45df-94ee-a68fba203c6e" containerName="extract" Apr 17 17:15:07.562415 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.562283 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f99dda6-1383-45df-94ee-a68fba203c6e" containerName="extract" Apr 17 17:15:07.562415 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.562351 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f99dda6-1383-45df-94ee-a68fba203c6e" containerName="extract" Apr 17 17:15:07.565169 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.565152 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" Apr 17 17:15:07.567906 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.567883 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:15:07.568070 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.567959 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:15:07.568670 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.568656 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2vk2g\"" Apr 17 17:15:07.573995 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.573972 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq"] Apr 17 17:15:07.730314 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.730287 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r9fp\" (UniqueName: \"kubernetes.io/projected/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-kube-api-access-6r9fp\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq\" (UID: 
\"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" Apr 17 17:15:07.730484 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.730334 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq\" (UID: \"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" Apr 17 17:15:07.730484 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.730354 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq\" (UID: \"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" Apr 17 17:15:07.831414 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.831337 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq\" (UID: \"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" Apr 17 17:15:07.831414 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.831373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq\" (UID: \"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" Apr 17 17:15:07.831608 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.831458 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6r9fp\" (UniqueName: \"kubernetes.io/projected/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-kube-api-access-6r9fp\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq\" (UID: \"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" Apr 17 17:15:07.831722 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.831701 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq\" (UID: \"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" Apr 17 17:15:07.831782 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.831729 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq\" (UID: \"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" Apr 17 17:15:07.839553 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.839530 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r9fp\" (UniqueName: \"kubernetes.io/projected/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-kube-api-access-6r9fp\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq\" (UID: \"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" Apr 17 17:15:07.874963 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.874937 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" Apr 17 17:15:07.999141 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:07.999115 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq"] Apr 17 17:15:08.001403 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:15:08.001378 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaa187a0_22b4_402f_8c0c_fd4eb2c9cbc7.slice/crio-0d445d696fd8b560670416c7d941d34bdb759f8758db0582966dfc0fb3a93d47 WatchSource:0}: Error finding container 0d445d696fd8b560670416c7d941d34bdb759f8758db0582966dfc0fb3a93d47: Status 404 returned error can't find the container with id 0d445d696fd8b560670416c7d941d34bdb759f8758db0582966dfc0fb3a93d47 Apr 17 17:15:08.382863 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:08.382790 2572 generic.go:358] "Generic (PLEG): container finished" podID="baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7" containerID="360994598b5664a17c7aeb906bfb4392683d5db47949bc2e1ec73a629a847c90" exitCode=0 Apr 17 17:15:08.382863 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:08.382828 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" event={"ID":"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7","Type":"ContainerDied","Data":"360994598b5664a17c7aeb906bfb4392683d5db47949bc2e1ec73a629a847c90"} Apr 17 17:15:08.382863 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:08.382851 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" event={"ID":"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7","Type":"ContainerStarted","Data":"0d445d696fd8b560670416c7d941d34bdb759f8758db0582966dfc0fb3a93d47"} Apr 17 17:15:09.387984 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.387911 2572 generic.go:358] "Generic (PLEG): container finished" podID="baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7" containerID="04c3f8c32af0aef4e2df0a6471e5f0e27ee7e1c0479750a2da0cb5f920f6ba18" exitCode=0 Apr 17 17:15:09.388412 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.387990 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" event={"ID":"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7","Type":"ContainerDied","Data":"04c3f8c32af0aef4e2df0a6471e5f0e27ee7e1c0479750a2da0cb5f920f6ba18"} Apr 17 17:15:09.543951 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.543929 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw"] Apr 17 17:15:09.547239 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.547221 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" Apr 17 17:15:09.552563 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.552545 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 17:15:09.552662 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.552644 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 17:15:09.552715 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.552698 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 17:15:09.553692 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.553676 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 17:15:09.554225 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.554199 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-m57m4\"" Apr 17 17:15:09.589621 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.587655 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw"] Apr 17 17:15:09.644221 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.644151 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af5524b1-ba23-4941-96f9-faddbb864aa7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-7qtnw\" (UID: \"af5524b1-ba23-4941-96f9-faddbb864aa7\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" Apr 17 17:15:09.644221 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.644189 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af5524b1-ba23-4941-96f9-faddbb864aa7-webhook-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-7qtnw\" (UID: \"af5524b1-ba23-4941-96f9-faddbb864aa7\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" Apr 17 17:15:09.644352 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.644274 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgvmd\" (UniqueName: \"kubernetes.io/projected/af5524b1-ba23-4941-96f9-faddbb864aa7-kube-api-access-lgvmd\") pod \"opendatahub-operator-controller-manager-6569445fb5-7qtnw\" (UID: \"af5524b1-ba23-4941-96f9-faddbb864aa7\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" Apr 17 17:15:09.745529 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.745502 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af5524b1-ba23-4941-96f9-faddbb864aa7-webhook-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-7qtnw\" (UID: \"af5524b1-ba23-4941-96f9-faddbb864aa7\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" Apr 17 17:15:09.745617 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.745545 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgvmd\" (UniqueName: \"kubernetes.io/projected/af5524b1-ba23-4941-96f9-faddbb864aa7-kube-api-access-lgvmd\") pod \"opendatahub-operator-controller-manager-6569445fb5-7qtnw\" (UID: \"af5524b1-ba23-4941-96f9-faddbb864aa7\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" Apr 17 17:15:09.745617 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.745587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af5524b1-ba23-4941-96f9-faddbb864aa7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-7qtnw\" (UID: \"af5524b1-ba23-4941-96f9-faddbb864aa7\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" Apr 17 17:15:09.747824 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.747802 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af5524b1-ba23-4941-96f9-faddbb864aa7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-7qtnw\" (UID: \"af5524b1-ba23-4941-96f9-faddbb864aa7\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" Apr 17 17:15:09.747900 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.747838 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af5524b1-ba23-4941-96f9-faddbb864aa7-webhook-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-7qtnw\" (UID: \"af5524b1-ba23-4941-96f9-faddbb864aa7\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" Apr 17 17:15:09.754464 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.754443 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgvmd\" (UniqueName: \"kubernetes.io/projected/af5524b1-ba23-4941-96f9-faddbb864aa7-kube-api-access-lgvmd\") pod \"opendatahub-operator-controller-manager-6569445fb5-7qtnw\" (UID: \"af5524b1-ba23-4941-96f9-faddbb864aa7\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" Apr 17 17:15:09.891692 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:09.891670 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" Apr 17 17:15:10.023006 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:10.022979 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw"] Apr 17 17:15:10.024797 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:15:10.024773 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf5524b1_ba23_4941_96f9_faddbb864aa7.slice/crio-5f0af5aa9f16667af9948fa8418b7ef5f9ee2e2dee15882d7907c60eef0262a3 WatchSource:0}: Error finding container 5f0af5aa9f16667af9948fa8418b7ef5f9ee2e2dee15882d7907c60eef0262a3: Status 404 returned error can't find the container with id 5f0af5aa9f16667af9948fa8418b7ef5f9ee2e2dee15882d7907c60eef0262a3 Apr 17 17:15:10.393270 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:10.393239 2572 generic.go:358] "Generic (PLEG): container finished" podID="baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7" containerID="2f8bba574d1499569ab4171056c19cba2f4e06c128b6e08103b2bf28eb957297" exitCode=0 Apr 17 17:15:10.393707 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:10.393310 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" event={"ID":"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7","Type":"ContainerDied","Data":"2f8bba574d1499569ab4171056c19cba2f4e06c128b6e08103b2bf28eb957297"} Apr 17 17:15:10.394369 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:10.394353 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" event={"ID":"af5524b1-ba23-4941-96f9-faddbb864aa7","Type":"ContainerStarted","Data":"5f0af5aa9f16667af9948fa8418b7ef5f9ee2e2dee15882d7907c60eef0262a3"} Apr 17 17:15:12.425004 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:12.424982 2572 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" Apr 17 17:15:12.569355 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:12.569331 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-util\") pod \"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\" (UID: \"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\") " Apr 17 17:15:12.569493 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:12.569383 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r9fp\" (UniqueName: \"kubernetes.io/projected/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-kube-api-access-6r9fp\") pod \"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\" (UID: \"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\") " Apr 17 17:15:12.569493 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:12.569421 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-bundle\") pod \"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\" (UID: \"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7\") " Apr 17 17:15:12.570119 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:12.570093 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-bundle" (OuterVolumeSpecName: "bundle") pod "baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7" (UID: "baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:15:12.571428 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:12.571406 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-kube-api-access-6r9fp" (OuterVolumeSpecName: "kube-api-access-6r9fp") pod "baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7" (UID: "baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7"). InnerVolumeSpecName "kube-api-access-6r9fp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:15:12.574509 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:12.574485 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-util" (OuterVolumeSpecName: "util") pod "baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7" (UID: "baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:15:12.670769 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:12.670711 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-util\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:15:12.670769 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:12.670736 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6r9fp\" (UniqueName: \"kubernetes.io/projected/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-kube-api-access-6r9fp\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:15:12.670769 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:12.670746 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7-bundle\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:15:13.407295 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:13.407191 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" event={"ID":"baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7","Type":"ContainerDied","Data":"0d445d696fd8b560670416c7d941d34bdb759f8758db0582966dfc0fb3a93d47"} Apr 17 17:15:13.407295 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:13.407251 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d445d696fd8b560670416c7d941d34bdb759f8758db0582966dfc0fb3a93d47" Apr 17 17:15:13.407295 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:13.407228 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9lpdpq" Apr 17 17:15:13.408593 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:13.408561 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" event={"ID":"af5524b1-ba23-4941-96f9-faddbb864aa7","Type":"ContainerStarted","Data":"acac6c174d543e974d103bb597a5d0ec87f65c32a2e8e302f3f2746ac50534cd"} Apr 17 17:15:13.408743 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:13.408722 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" Apr 17 17:15:13.428680 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:13.428641 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw" podStartSLOduration=1.998958611 podStartE2EDuration="4.428632433s" podCreationTimestamp="2026-04-17 17:15:09 +0000 UTC" firstStartedPulling="2026-04-17 17:15:10.027161471 +0000 UTC m=+437.694845104" lastFinishedPulling="2026-04-17 17:15:12.456835295 +0000 UTC m=+440.124518926" observedRunningTime="2026-04-17 17:15:13.427247196 +0000 UTC m=+441.094930850" watchObservedRunningTime="2026-04-17 17:15:13.428632433 
+0000 UTC m=+441.096316086"
Apr 17 17:15:24.414566 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:24.414533 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-7qtnw"
Apr 17 17:15:26.591053 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.591021 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"]
Apr 17 17:15:26.591506 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.591319 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7" containerName="pull"
Apr 17 17:15:26.591506 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.591331 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7" containerName="pull"
Apr 17 17:15:26.591506 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.591338 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7" containerName="extract"
Apr 17 17:15:26.591506 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.591344 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7" containerName="extract"
Apr 17 17:15:26.591506 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.591361 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7" containerName="util"
Apr 17 17:15:26.591506 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.591367 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7" containerName="util"
Apr 17 17:15:26.591506 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.591413 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="baa187a0-22b4-402f-8c0c-fd4eb2c9cbc7" containerName="extract"
Apr 17 17:15:26.595647 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.595629 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"
Apr 17 17:15:26.598312 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.598283 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 17:15:26.598312 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.598298 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 17:15:26.598312 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.598283 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2vk2g\""
Apr 17 17:15:26.606723 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.606693 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"]
Apr 17 17:15:26.671875 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.671847 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5k78\" (UniqueName: \"kubernetes.io/projected/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-kube-api-access-r5k78\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87\" (UID: \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"
Apr 17 17:15:26.671980 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.671890 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87\" (UID: \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"
Apr 17 17:15:26.672021 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.671996 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87\" (UID: \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"
Apr 17 17:15:26.772895 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.772861 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87\" (UID: \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"
Apr 17 17:15:26.773008 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.772921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87\" (UID: \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"
Apr 17 17:15:26.773008 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.772965 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5k78\" (UniqueName: \"kubernetes.io/projected/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-kube-api-access-r5k78\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87\" (UID: \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"
Apr 17 17:15:26.773326 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.773309 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87\" (UID: \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"
Apr 17 17:15:26.773398 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.773334 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87\" (UID: \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"
Apr 17 17:15:26.786713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.786650 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5k78\" (UniqueName: \"kubernetes.io/projected/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-kube-api-access-r5k78\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87\" (UID: \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"
Apr 17 17:15:26.904641 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:26.904577 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"
Apr 17 17:15:27.036218 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:27.036181 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"]
Apr 17 17:15:27.039317 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:15:27.039281 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b39ac81_0c1f_48ec_8cce_5d80f847e1db.slice/crio-51b9166661498243eb4743520ba9ec997f800741fc21cc109ce9a2ac14fabf20 WatchSource:0}: Error finding container 51b9166661498243eb4743520ba9ec997f800741fc21cc109ce9a2ac14fabf20: Status 404 returned error can't find the container with id 51b9166661498243eb4743520ba9ec997f800741fc21cc109ce9a2ac14fabf20
Apr 17 17:15:27.455895 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:27.455863 2572 generic.go:358] "Generic (PLEG): container finished" podID="3b39ac81-0c1f-48ec-8cce-5d80f847e1db" containerID="bd8e9eaaa338a11eda812f2122bbf592b6bf92109eade48c59c27d4dab868022" exitCode=0
Apr 17 17:15:27.456025 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:27.455926 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87" event={"ID":"3b39ac81-0c1f-48ec-8cce-5d80f847e1db","Type":"ContainerDied","Data":"bd8e9eaaa338a11eda812f2122bbf592b6bf92109eade48c59c27d4dab868022"}
Apr 17 17:15:27.456025 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:27.455949 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87" event={"ID":"3b39ac81-0c1f-48ec-8cce-5d80f847e1db","Type":"ContainerStarted","Data":"51b9166661498243eb4743520ba9ec997f800741fc21cc109ce9a2ac14fabf20"}
Apr 17 17:15:28.299817 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.299784 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5"]
Apr 17 17:15:28.303003 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.302983 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5"
Apr 17 17:15:28.305355 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.305331 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 17 17:15:28.305492 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.305362 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-kp745\""
Apr 17 17:15:28.305492 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.305443 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 17 17:15:28.312644 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.312622 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5"]
Apr 17 17:15:28.386657 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.386587 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7190378e-cdc3-4581-b7a0-d93b9cd31af8-tmp\") pod \"kube-auth-proxy-db5457dbf-xgnb5\" (UID: \"7190378e-cdc3-4581-b7a0-d93b9cd31af8\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5"
Apr 17 17:15:28.386657 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.386619 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7190378e-cdc3-4581-b7a0-d93b9cd31af8-tls-certs\") pod \"kube-auth-proxy-db5457dbf-xgnb5\" (UID: \"7190378e-cdc3-4581-b7a0-d93b9cd31af8\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5"
Apr 17 17:15:28.386657 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.386648 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p27w5\" (UniqueName: \"kubernetes.io/projected/7190378e-cdc3-4581-b7a0-d93b9cd31af8-kube-api-access-p27w5\") pod \"kube-auth-proxy-db5457dbf-xgnb5\" (UID: \"7190378e-cdc3-4581-b7a0-d93b9cd31af8\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5"
Apr 17 17:15:28.460378 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.460344 2572 generic.go:358] "Generic (PLEG): container finished" podID="3b39ac81-0c1f-48ec-8cce-5d80f847e1db" containerID="239ffaf49ab57701b00ccf58ae971e03337aa5314d0c771c8c77226f6816120f" exitCode=0
Apr 17 17:15:28.460516 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.460435 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87" event={"ID":"3b39ac81-0c1f-48ec-8cce-5d80f847e1db","Type":"ContainerDied","Data":"239ffaf49ab57701b00ccf58ae971e03337aa5314d0c771c8c77226f6816120f"}
Apr 17 17:15:28.487882 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.487857 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7190378e-cdc3-4581-b7a0-d93b9cd31af8-tmp\") pod \"kube-auth-proxy-db5457dbf-xgnb5\" (UID: \"7190378e-cdc3-4581-b7a0-d93b9cd31af8\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5"
Apr 17 17:15:28.487995 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.487886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7190378e-cdc3-4581-b7a0-d93b9cd31af8-tls-certs\") pod \"kube-auth-proxy-db5457dbf-xgnb5\" (UID: \"7190378e-cdc3-4581-b7a0-d93b9cd31af8\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5"
Apr 17 17:15:28.487995 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.487915 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p27w5\" (UniqueName: \"kubernetes.io/projected/7190378e-cdc3-4581-b7a0-d93b9cd31af8-kube-api-access-p27w5\") pod \"kube-auth-proxy-db5457dbf-xgnb5\" (UID: \"7190378e-cdc3-4581-b7a0-d93b9cd31af8\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5"
Apr 17 17:15:28.489993 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.489973 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7190378e-cdc3-4581-b7a0-d93b9cd31af8-tmp\") pod \"kube-auth-proxy-db5457dbf-xgnb5\" (UID: \"7190378e-cdc3-4581-b7a0-d93b9cd31af8\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5"
Apr 17 17:15:28.490192 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.490176 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7190378e-cdc3-4581-b7a0-d93b9cd31af8-tls-certs\") pod \"kube-auth-proxy-db5457dbf-xgnb5\" (UID: \"7190378e-cdc3-4581-b7a0-d93b9cd31af8\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5"
Apr 17 17:15:28.495390 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.495372 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p27w5\" (UniqueName: \"kubernetes.io/projected/7190378e-cdc3-4581-b7a0-d93b9cd31af8-kube-api-access-p27w5\") pod \"kube-auth-proxy-db5457dbf-xgnb5\" (UID: \"7190378e-cdc3-4581-b7a0-d93b9cd31af8\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5"
Apr 17 17:15:28.612301 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.612270 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5"
Apr 17 17:15:28.735170 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:28.735145 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5"]
Apr 17 17:15:28.736827 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:15:28.736784 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7190378e_cdc3_4581_b7a0_d93b9cd31af8.slice/crio-a3536761d9d3c1d9fd7c3d7aeb20fcd9bec28578a05509685f828e8df787b467 WatchSource:0}: Error finding container a3536761d9d3c1d9fd7c3d7aeb20fcd9bec28578a05509685f828e8df787b467: Status 404 returned error can't find the container with id a3536761d9d3c1d9fd7c3d7aeb20fcd9bec28578a05509685f828e8df787b467
Apr 17 17:15:29.466719 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:29.466673 2572 generic.go:358] "Generic (PLEG): container finished" podID="3b39ac81-0c1f-48ec-8cce-5d80f847e1db" containerID="1ae8294369302ef33371498df0d0f17385b24cf1225a8abec93db21849838f10" exitCode=0
Apr 17 17:15:29.467165 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:29.466819 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87" event={"ID":"3b39ac81-0c1f-48ec-8cce-5d80f847e1db","Type":"ContainerDied","Data":"1ae8294369302ef33371498df0d0f17385b24cf1225a8abec93db21849838f10"}
Apr 17 17:15:29.468140 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:29.468108 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5" event={"ID":"7190378e-cdc3-4581-b7a0-d93b9cd31af8","Type":"ContainerStarted","Data":"a3536761d9d3c1d9fd7c3d7aeb20fcd9bec28578a05509685f828e8df787b467"}
Apr 17 17:15:29.970529 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:29.970476 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-tqt9s"]
Apr 17 17:15:29.978089 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:29.978063 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s"
Apr 17 17:15:29.980042 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:29.979989 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-tqt9s"]
Apr 17 17:15:29.980542 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:29.980517 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 17 17:15:29.980916 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:29.980507 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-kxkhc\""
Apr 17 17:15:30.001295 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.001273 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/543f91a3-6c9c-4a1b-837f-670ccec0ff35-cert\") pod \"odh-model-controller-858dbf95b8-tqt9s\" (UID: \"543f91a3-6c9c-4a1b-837f-670ccec0ff35\") " pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s"
Apr 17 17:15:30.001395 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.001329 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn4df\" (UniqueName: \"kubernetes.io/projected/543f91a3-6c9c-4a1b-837f-670ccec0ff35-kube-api-access-pn4df\") pod \"odh-model-controller-858dbf95b8-tqt9s\" (UID: \"543f91a3-6c9c-4a1b-837f-670ccec0ff35\") " pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s"
Apr 17 17:15:30.102431 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.102392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/543f91a3-6c9c-4a1b-837f-670ccec0ff35-cert\") pod \"odh-model-controller-858dbf95b8-tqt9s\" (UID: \"543f91a3-6c9c-4a1b-837f-670ccec0ff35\") " pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s"
Apr 17 17:15:30.102590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.102454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pn4df\" (UniqueName: \"kubernetes.io/projected/543f91a3-6c9c-4a1b-837f-670ccec0ff35-kube-api-access-pn4df\") pod \"odh-model-controller-858dbf95b8-tqt9s\" (UID: \"543f91a3-6c9c-4a1b-837f-670ccec0ff35\") " pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s"
Apr 17 17:15:30.102590 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:15:30.102560 2572 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 17 17:15:30.102693 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:15:30.102641 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/543f91a3-6c9c-4a1b-837f-670ccec0ff35-cert podName:543f91a3-6c9c-4a1b-837f-670ccec0ff35 nodeName:}" failed. No retries permitted until 2026-04-17 17:15:30.602616601 +0000 UTC m=+458.270300284 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/543f91a3-6c9c-4a1b-837f-670ccec0ff35-cert") pod "odh-model-controller-858dbf95b8-tqt9s" (UID: "543f91a3-6c9c-4a1b-837f-670ccec0ff35") : secret "odh-model-controller-webhook-cert" not found
Apr 17 17:15:30.111797 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.111772 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn4df\" (UniqueName: \"kubernetes.io/projected/543f91a3-6c9c-4a1b-837f-670ccec0ff35-kube-api-access-pn4df\") pod \"odh-model-controller-858dbf95b8-tqt9s\" (UID: \"543f91a3-6c9c-4a1b-837f-670ccec0ff35\") " pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s"
Apr 17 17:15:30.608004 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.607950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/543f91a3-6c9c-4a1b-837f-670ccec0ff35-cert\") pod \"odh-model-controller-858dbf95b8-tqt9s\" (UID: \"543f91a3-6c9c-4a1b-837f-670ccec0ff35\") " pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s"
Apr 17 17:15:30.608396 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:15:30.608115 2572 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 17 17:15:30.608396 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:15:30.608190 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/543f91a3-6c9c-4a1b-837f-670ccec0ff35-cert podName:543f91a3-6c9c-4a1b-837f-670ccec0ff35 nodeName:}" failed. No retries permitted until 2026-04-17 17:15:31.60816831 +0000 UTC m=+459.275851952 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/543f91a3-6c9c-4a1b-837f-670ccec0ff35-cert") pod "odh-model-controller-858dbf95b8-tqt9s" (UID: "543f91a3-6c9c-4a1b-837f-670ccec0ff35") : secret "odh-model-controller-webhook-cert" not found
Apr 17 17:15:30.663524 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.663498 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"
Apr 17 17:15:30.708276 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.708252 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5k78\" (UniqueName: \"kubernetes.io/projected/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-kube-api-access-r5k78\") pod \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\" (UID: \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\") "
Apr 17 17:15:30.708414 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.708283 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-util\") pod \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\" (UID: \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\") "
Apr 17 17:15:30.708414 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.708338 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-bundle\") pod \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\" (UID: \"3b39ac81-0c1f-48ec-8cce-5d80f847e1db\") "
Apr 17 17:15:30.709891 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.709862 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-bundle" (OuterVolumeSpecName: "bundle") pod "3b39ac81-0c1f-48ec-8cce-5d80f847e1db" (UID: "3b39ac81-0c1f-48ec-8cce-5d80f847e1db"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:15:30.710590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.710559 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-kube-api-access-r5k78" (OuterVolumeSpecName: "kube-api-access-r5k78") pod "3b39ac81-0c1f-48ec-8cce-5d80f847e1db" (UID: "3b39ac81-0c1f-48ec-8cce-5d80f847e1db"). InnerVolumeSpecName "kube-api-access-r5k78". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:15:30.715479 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.715313 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-util" (OuterVolumeSpecName: "util") pod "3b39ac81-0c1f-48ec-8cce-5d80f847e1db" (UID: "3b39ac81-0c1f-48ec-8cce-5d80f847e1db"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:15:30.809147 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.809067 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r5k78\" (UniqueName: \"kubernetes.io/projected/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-kube-api-access-r5k78\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\""
Apr 17 17:15:30.809147 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.809104 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-util\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\""
Apr 17 17:15:30.809147 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:30.809118 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b39ac81-0c1f-48ec-8cce-5d80f847e1db-bundle\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\""
Apr 17 17:15:31.478947 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:31.478908 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87" event={"ID":"3b39ac81-0c1f-48ec-8cce-5d80f847e1db","Type":"ContainerDied","Data":"51b9166661498243eb4743520ba9ec997f800741fc21cc109ce9a2ac14fabf20"}
Apr 17 17:15:31.478947 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:31.478951 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51b9166661498243eb4743520ba9ec997f800741fc21cc109ce9a2ac14fabf20"
Apr 17 17:15:31.479147 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:31.478959 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9q87"
Apr 17 17:15:31.615594 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:31.615550 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/543f91a3-6c9c-4a1b-837f-670ccec0ff35-cert\") pod \"odh-model-controller-858dbf95b8-tqt9s\" (UID: \"543f91a3-6c9c-4a1b-837f-670ccec0ff35\") " pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s"
Apr 17 17:15:31.618337 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:31.618307 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/543f91a3-6c9c-4a1b-837f-670ccec0ff35-cert\") pod \"odh-model-controller-858dbf95b8-tqt9s\" (UID: \"543f91a3-6c9c-4a1b-837f-670ccec0ff35\") " pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s"
Apr 17 17:15:31.792377 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:31.792303 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s"
Apr 17 17:15:32.017736 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:32.017713 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-tqt9s"]
Apr 17 17:15:32.020836 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:15:32.020813 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod543f91a3_6c9c_4a1b_837f_670ccec0ff35.slice/crio-44377529befbb0470c0e76adf38a51ad8ae3e32951ea868ae7e7cb936594f6c1 WatchSource:0}: Error finding container 44377529befbb0470c0e76adf38a51ad8ae3e32951ea868ae7e7cb936594f6c1: Status 404 returned error can't find the container with id 44377529befbb0470c0e76adf38a51ad8ae3e32951ea868ae7e7cb936594f6c1
Apr 17 17:15:32.483102 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:32.483073 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s" event={"ID":"543f91a3-6c9c-4a1b-837f-670ccec0ff35","Type":"ContainerStarted","Data":"44377529befbb0470c0e76adf38a51ad8ae3e32951ea868ae7e7cb936594f6c1"}
Apr 17 17:15:32.484471 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:32.484448 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5" event={"ID":"7190378e-cdc3-4581-b7a0-d93b9cd31af8","Type":"ContainerStarted","Data":"125462bb7d010f18ab31f5dab1a665b213d562ee2382058005fd83f1bf15f811"}
Apr 17 17:15:32.500190 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:32.500143 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-db5457dbf-xgnb5" podStartSLOduration=1.280797826 podStartE2EDuration="4.500126517s" podCreationTimestamp="2026-04-17 17:15:28 +0000 UTC" firstStartedPulling="2026-04-17 17:15:28.738507972 +0000 UTC m=+456.406191605" lastFinishedPulling="2026-04-17 17:15:31.957836657 +0000 UTC m=+459.625520296" observedRunningTime="2026-04-17 17:15:32.499135533 +0000 UTC m=+460.166819186" watchObservedRunningTime="2026-04-17 17:15:32.500126517 +0000 UTC m=+460.167810172"
Apr 17 17:15:35.344905 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.344830 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-zgh7t"]
Apr 17 17:15:35.345331 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.345312 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b39ac81-0c1f-48ec-8cce-5d80f847e1db" containerName="extract"
Apr 17 17:15:35.345403 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.345334 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b39ac81-0c1f-48ec-8cce-5d80f847e1db" containerName="extract"
Apr 17 17:15:35.345403 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.345349 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b39ac81-0c1f-48ec-8cce-5d80f847e1db" containerName="pull"
Apr 17 17:15:35.345403 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.345357 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b39ac81-0c1f-48ec-8cce-5d80f847e1db" containerName="pull"
Apr 17 17:15:35.345403 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.345378 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b39ac81-0c1f-48ec-8cce-5d80f847e1db" containerName="util"
Apr 17 17:15:35.345403 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.345385 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b39ac81-0c1f-48ec-8cce-5d80f847e1db" containerName="util"
Apr 17 17:15:35.345700 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.345474 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b39ac81-0c1f-48ec-8cce-5d80f847e1db" containerName="extract"
Apr 17 17:15:35.348408 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.348388 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-zgh7t"
Apr 17 17:15:35.351232 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.351203 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 17 17:15:35.351430 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.351418 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-lclbv\""
Apr 17 17:15:35.363238 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.363198 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-zgh7t"]
Apr 17 17:15:35.460157 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.460126 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a52b22a7-ef79-4ea3-9766-bb80d6394b58-cert\") pod \"kserve-controller-manager-856948b99f-zgh7t\" (UID: \"a52b22a7-ef79-4ea3-9766-bb80d6394b58\") " pod="opendatahub/kserve-controller-manager-856948b99f-zgh7t"
Apr 17 17:15:35.460313 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.460159 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-562rx\" (UniqueName: \"kubernetes.io/projected/a52b22a7-ef79-4ea3-9766-bb80d6394b58-kube-api-access-562rx\") pod \"kserve-controller-manager-856948b99f-zgh7t\" (UID: \"a52b22a7-ef79-4ea3-9766-bb80d6394b58\") " pod="opendatahub/kserve-controller-manager-856948b99f-zgh7t"
Apr 17 17:15:35.495849 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.495820 2572 generic.go:358] "Generic (PLEG): container finished" podID="543f91a3-6c9c-4a1b-837f-670ccec0ff35" containerID="0907a8e5bebe8e8120899b0c1c6e5193270a111264e7e227616a994c556da37f" exitCode=1
Apr 17 17:15:35.495978 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.495881 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s" event={"ID":"543f91a3-6c9c-4a1b-837f-670ccec0ff35","Type":"ContainerDied","Data":"0907a8e5bebe8e8120899b0c1c6e5193270a111264e7e227616a994c556da37f"}
Apr 17 17:15:35.496081 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.496067 2572 scope.go:117] "RemoveContainer" containerID="0907a8e5bebe8e8120899b0c1c6e5193270a111264e7e227616a994c556da37f"
Apr 17 17:15:35.561535 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.561497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a52b22a7-ef79-4ea3-9766-bb80d6394b58-cert\") pod \"kserve-controller-manager-856948b99f-zgh7t\" (UID: \"a52b22a7-ef79-4ea3-9766-bb80d6394b58\") " pod="opendatahub/kserve-controller-manager-856948b99f-zgh7t"
Apr 17 17:15:35.561665 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.561539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-562rx\" (UniqueName: \"kubernetes.io/projected/a52b22a7-ef79-4ea3-9766-bb80d6394b58-kube-api-access-562rx\") pod \"kserve-controller-manager-856948b99f-zgh7t\" (UID: \"a52b22a7-ef79-4ea3-9766-bb80d6394b58\") " pod="opendatahub/kserve-controller-manager-856948b99f-zgh7t"
Apr 17 17:15:35.561731 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:15:35.561664 2572 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 17 17:15:35.561790 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:15:35.561740 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a52b22a7-ef79-4ea3-9766-bb80d6394b58-cert podName:a52b22a7-ef79-4ea3-9766-bb80d6394b58 nodeName:}" failed. No retries permitted until 2026-04-17 17:15:36.061715944 +0000 UTC m=+463.729399582 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a52b22a7-ef79-4ea3-9766-bb80d6394b58-cert") pod "kserve-controller-manager-856948b99f-zgh7t" (UID: "a52b22a7-ef79-4ea3-9766-bb80d6394b58") : secret "kserve-webhook-server-cert" not found
Apr 17 17:15:35.571051 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:35.571030 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-562rx\" (UniqueName: \"kubernetes.io/projected/a52b22a7-ef79-4ea3-9766-bb80d6394b58-kube-api-access-562rx\") pod \"kserve-controller-manager-856948b99f-zgh7t\" (UID: \"a52b22a7-ef79-4ea3-9766-bb80d6394b58\") " pod="opendatahub/kserve-controller-manager-856948b99f-zgh7t"
Apr 17 17:15:36.065471 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:36.065390 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a52b22a7-ef79-4ea3-9766-bb80d6394b58-cert\") pod \"kserve-controller-manager-856948b99f-zgh7t\" (UID: \"a52b22a7-ef79-4ea3-9766-bb80d6394b58\") " pod="opendatahub/kserve-controller-manager-856948b99f-zgh7t"
Apr 17 17:15:36.067879 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:36.067848 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a52b22a7-ef79-4ea3-9766-bb80d6394b58-cert\") pod \"kserve-controller-manager-856948b99f-zgh7t\" (UID: \"a52b22a7-ef79-4ea3-9766-bb80d6394b58\") " pod="opendatahub/kserve-controller-manager-856948b99f-zgh7t"
Apr 17 17:15:36.258688 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:36.258653 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-zgh7t"
Apr 17 17:15:36.380367 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:36.380340 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-zgh7t"]
Apr 17 17:15:36.383612 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:15:36.383549 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda52b22a7_ef79_4ea3_9766_bb80d6394b58.slice/crio-4aa830b16b2e55963cdcab0b2783872340b52820b0ef8fdef95638ad2226410a WatchSource:0}: Error finding container 4aa830b16b2e55963cdcab0b2783872340b52820b0ef8fdef95638ad2226410a: Status 404 returned error can't find the container with id 4aa830b16b2e55963cdcab0b2783872340b52820b0ef8fdef95638ad2226410a
Apr 17 17:15:36.501346 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:36.501307 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s" event={"ID":"543f91a3-6c9c-4a1b-837f-670ccec0ff35","Type":"ContainerStarted","Data":"33d97e3f027a7abf71937be39fb7ab4888ee844dd854c5a44d9998f3f12027e4"}
Apr 17 17:15:36.501526 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:36.501488 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s"
Apr 17 17:15:36.502490 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:36.502462 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-zgh7t" event={"ID":"a52b22a7-ef79-4ea3-9766-bb80d6394b58","Type":"ContainerStarted","Data":"4aa830b16b2e55963cdcab0b2783872340b52820b0ef8fdef95638ad2226410a"}
Apr 17 17:15:36.518399 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:36.518352 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s" podStartSLOduration=3.738759725 podStartE2EDuration="7.518340172s" podCreationTimestamp="2026-04-17 17:15:29 +0000 UTC" firstStartedPulling="2026-04-17 17:15:32.022543792 +0000 UTC m=+459.690227427" lastFinishedPulling="2026-04-17 17:15:35.802124238 +0000 UTC m=+463.469807874" observedRunningTime="2026-04-17 17:15:36.517085764 +0000 UTC m=+464.184769418" watchObservedRunningTime="2026-04-17 17:15:36.518340172 +0000 UTC m=+464.186023826"
Apr 17 17:15:39.514953 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:39.514917 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-zgh7t" event={"ID":"a52b22a7-ef79-4ea3-9766-bb80d6394b58","Type":"ContainerStarted","Data":"d3034eab8e51a24d147c2a9a981c87d2da787359a5121977dc6aeb1ef82a3718"}
Apr 17 17:15:39.515392 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:39.515003 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-zgh7t"
Apr 17 17:15:39.550146 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:39.550096 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-zgh7t" podStartSLOduration=1.7950747329999999 podStartE2EDuration="4.55008043s" podCreationTimestamp="2026-04-17 17:15:35 +0000 UTC" firstStartedPulling="2026-04-17 17:15:36.385492052 +0000 UTC m=+464.053175684" lastFinishedPulling="2026-04-17 17:15:39.140497749 +0000 UTC m=+466.808181381" observedRunningTime="2026-04-17 17:15:39.547824779 +0000 UTC m=+467.215508432" watchObservedRunningTime="2026-04-17 17:15:39.55008043 +0000 UTC m=+467.217764085"
Apr 17 17:15:40.954020 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:40.953990 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs"]
Apr 17 17:15:40.957458 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:40.957439 2572 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" Apr 17 17:15:40.961191 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:40.961171 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2vk2g\"" Apr 17 17:15:40.961322 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:40.961262 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:15:40.962362 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:40.962347 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:15:40.982628 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:40.982602 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs"] Apr 17 17:15:41.004963 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:41.004931 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvtw8\" (UniqueName: \"kubernetes.io/projected/6590ccce-0a86-4725-a3ff-da68cd527a7f-kube-api-access-bvtw8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs\" (UID: \"6590ccce-0a86-4725-a3ff-da68cd527a7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" Apr 17 17:15:41.005112 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:41.004978 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6590ccce-0a86-4725-a3ff-da68cd527a7f-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs\" (UID: \"6590ccce-0a86-4725-a3ff-da68cd527a7f\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" Apr 17 17:15:41.005177 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:41.005120 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6590ccce-0a86-4725-a3ff-da68cd527a7f-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs\" (UID: \"6590ccce-0a86-4725-a3ff-da68cd527a7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" Apr 17 17:15:41.106456 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:41.106432 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6590ccce-0a86-4725-a3ff-da68cd527a7f-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs\" (UID: \"6590ccce-0a86-4725-a3ff-da68cd527a7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" Apr 17 17:15:41.106562 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:41.106480 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvtw8\" (UniqueName: \"kubernetes.io/projected/6590ccce-0a86-4725-a3ff-da68cd527a7f-kube-api-access-bvtw8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs\" (UID: \"6590ccce-0a86-4725-a3ff-da68cd527a7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" Apr 17 17:15:41.106562 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:41.106502 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6590ccce-0a86-4725-a3ff-da68cd527a7f-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs\" (UID: \"6590ccce-0a86-4725-a3ff-da68cd527a7f\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" Apr 17 17:15:41.106828 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:41.106811 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6590ccce-0a86-4725-a3ff-da68cd527a7f-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs\" (UID: \"6590ccce-0a86-4725-a3ff-da68cd527a7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" Apr 17 17:15:41.106868 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:41.106857 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6590ccce-0a86-4725-a3ff-da68cd527a7f-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs\" (UID: \"6590ccce-0a86-4725-a3ff-da68cd527a7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" Apr 17 17:15:41.121114 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:41.121091 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvtw8\" (UniqueName: \"kubernetes.io/projected/6590ccce-0a86-4725-a3ff-da68cd527a7f-kube-api-access-bvtw8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs\" (UID: \"6590ccce-0a86-4725-a3ff-da68cd527a7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" Apr 17 17:15:41.266856 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:41.266792 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" Apr 17 17:15:41.433548 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:41.433511 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs"] Apr 17 17:15:41.433835 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:15:41.433807 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6590ccce_0a86_4725_a3ff_da68cd527a7f.slice/crio-4e62b83f6b384446b1f65a721ca362c50196a16bb673b2b757dd6425a0a3b644 WatchSource:0}: Error finding container 4e62b83f6b384446b1f65a721ca362c50196a16bb673b2b757dd6425a0a3b644: Status 404 returned error can't find the container with id 4e62b83f6b384446b1f65a721ca362c50196a16bb673b2b757dd6425a0a3b644 Apr 17 17:15:41.523324 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:41.523250 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" event={"ID":"6590ccce-0a86-4725-a3ff-da68cd527a7f","Type":"ContainerStarted","Data":"957aee58f55368c7a31079897d1a94e3e698f9c9183701c0689e30a51f97412b"} Apr 17 17:15:41.523324 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:41.523297 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" event={"ID":"6590ccce-0a86-4725-a3ff-da68cd527a7f","Type":"ContainerStarted","Data":"4e62b83f6b384446b1f65a721ca362c50196a16bb673b2b757dd6425a0a3b644"} Apr 17 17:15:42.238353 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.238316 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-grws5"] Apr 17 17:15:42.241648 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.241629 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-grws5" Apr 17 17:15:42.245676 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.245656 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-xz69l\"" Apr 17 17:15:42.246117 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.246096 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 17 17:15:42.246235 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.246129 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 17 17:15:42.262631 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.262608 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-grws5"] Apr 17 17:15:42.318527 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.318494 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/9ea4f9b6-9453-49e6-bbcb-127ff98db954-operator-config\") pod \"servicemesh-operator3-55f49c5f94-grws5\" (UID: \"9ea4f9b6-9453-49e6-bbcb-127ff98db954\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-grws5" Apr 17 17:15:42.318678 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.318567 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgnrh\" (UniqueName: \"kubernetes.io/projected/9ea4f9b6-9453-49e6-bbcb-127ff98db954-kube-api-access-jgnrh\") pod \"servicemesh-operator3-55f49c5f94-grws5\" (UID: \"9ea4f9b6-9453-49e6-bbcb-127ff98db954\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-grws5" Apr 17 17:15:42.419370 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.419302 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/9ea4f9b6-9453-49e6-bbcb-127ff98db954-operator-config\") pod \"servicemesh-operator3-55f49c5f94-grws5\" (UID: \"9ea4f9b6-9453-49e6-bbcb-127ff98db954\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-grws5" Apr 17 17:15:42.419471 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.419382 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgnrh\" (UniqueName: \"kubernetes.io/projected/9ea4f9b6-9453-49e6-bbcb-127ff98db954-kube-api-access-jgnrh\") pod \"servicemesh-operator3-55f49c5f94-grws5\" (UID: \"9ea4f9b6-9453-49e6-bbcb-127ff98db954\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-grws5" Apr 17 17:15:42.421837 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.421814 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/9ea4f9b6-9453-49e6-bbcb-127ff98db954-operator-config\") pod \"servicemesh-operator3-55f49c5f94-grws5\" (UID: \"9ea4f9b6-9453-49e6-bbcb-127ff98db954\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-grws5" Apr 17 17:15:42.432627 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.432601 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgnrh\" (UniqueName: \"kubernetes.io/projected/9ea4f9b6-9453-49e6-bbcb-127ff98db954-kube-api-access-jgnrh\") pod \"servicemesh-operator3-55f49c5f94-grws5\" (UID: \"9ea4f9b6-9453-49e6-bbcb-127ff98db954\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-grws5" Apr 17 17:15:42.528540 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.528515 2572 generic.go:358] "Generic (PLEG): container finished" podID="6590ccce-0a86-4725-a3ff-da68cd527a7f" containerID="957aee58f55368c7a31079897d1a94e3e698f9c9183701c0689e30a51f97412b" exitCode=0 Apr 17 17:15:42.528664 ip-10-0-138-224 kubenswrapper[2572]: 
I0417 17:15:42.528607 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" event={"ID":"6590ccce-0a86-4725-a3ff-da68cd527a7f","Type":"ContainerDied","Data":"957aee58f55368c7a31079897d1a94e3e698f9c9183701c0689e30a51f97412b"} Apr 17 17:15:42.550543 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.550518 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-grws5" Apr 17 17:15:42.674805 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:42.674780 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-grws5"] Apr 17 17:15:42.676358 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:15:42.676327 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ea4f9b6_9453_49e6_bbcb_127ff98db954.slice/crio-d8ee94c42bd3e8c4adbf4f1b4c3ba80ac0aeaf43127fa3d1b9613e9d89e91d6e WatchSource:0}: Error finding container d8ee94c42bd3e8c4adbf4f1b4c3ba80ac0aeaf43127fa3d1b9613e9d89e91d6e: Status 404 returned error can't find the container with id d8ee94c42bd3e8c4adbf4f1b4c3ba80ac0aeaf43127fa3d1b9613e9d89e91d6e Apr 17 17:15:43.536936 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:43.536892 2572 generic.go:358] "Generic (PLEG): container finished" podID="6590ccce-0a86-4725-a3ff-da68cd527a7f" containerID="dd59ad4edaaf887e98a981bd7bf517d8d686de90bf8d246ba96163bf82c7e24d" exitCode=0 Apr 17 17:15:43.537418 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:43.536940 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" event={"ID":"6590ccce-0a86-4725-a3ff-da68cd527a7f","Type":"ContainerDied","Data":"dd59ad4edaaf887e98a981bd7bf517d8d686de90bf8d246ba96163bf82c7e24d"} Apr 17 17:15:43.539378 ip-10-0-138-224 
kubenswrapper[2572]: I0417 17:15:43.539242 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-grws5" event={"ID":"9ea4f9b6-9453-49e6-bbcb-127ff98db954","Type":"ContainerStarted","Data":"d8ee94c42bd3e8c4adbf4f1b4c3ba80ac0aeaf43127fa3d1b9613e9d89e91d6e"} Apr 17 17:15:44.544656 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:44.544623 2572 generic.go:358] "Generic (PLEG): container finished" podID="6590ccce-0a86-4725-a3ff-da68cd527a7f" containerID="f7b7e9073bd31f35769c8026ab87028709b7716917a753b6e3c0743b81cb2c03" exitCode=0 Apr 17 17:15:44.545036 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:44.544698 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" event={"ID":"6590ccce-0a86-4725-a3ff-da68cd527a7f","Type":"ContainerDied","Data":"f7b7e9073bd31f35769c8026ab87028709b7716917a753b6e3c0743b81cb2c03"} Apr 17 17:15:45.675354 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:45.675325 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" Apr 17 17:15:45.751989 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:45.751947 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvtw8\" (UniqueName: \"kubernetes.io/projected/6590ccce-0a86-4725-a3ff-da68cd527a7f-kube-api-access-bvtw8\") pod \"6590ccce-0a86-4725-a3ff-da68cd527a7f\" (UID: \"6590ccce-0a86-4725-a3ff-da68cd527a7f\") " Apr 17 17:15:45.752186 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:45.752064 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6590ccce-0a86-4725-a3ff-da68cd527a7f-bundle\") pod \"6590ccce-0a86-4725-a3ff-da68cd527a7f\" (UID: \"6590ccce-0a86-4725-a3ff-da68cd527a7f\") " Apr 17 17:15:45.752186 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:45.752101 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6590ccce-0a86-4725-a3ff-da68cd527a7f-util\") pod \"6590ccce-0a86-4725-a3ff-da68cd527a7f\" (UID: \"6590ccce-0a86-4725-a3ff-da68cd527a7f\") " Apr 17 17:15:45.753153 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:45.753084 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6590ccce-0a86-4725-a3ff-da68cd527a7f-bundle" (OuterVolumeSpecName: "bundle") pod "6590ccce-0a86-4725-a3ff-da68cd527a7f" (UID: "6590ccce-0a86-4725-a3ff-da68cd527a7f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:15:45.754634 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:45.754606 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6590ccce-0a86-4725-a3ff-da68cd527a7f-kube-api-access-bvtw8" (OuterVolumeSpecName: "kube-api-access-bvtw8") pod "6590ccce-0a86-4725-a3ff-da68cd527a7f" (UID: "6590ccce-0a86-4725-a3ff-da68cd527a7f"). InnerVolumeSpecName "kube-api-access-bvtw8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:15:45.759359 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:45.759334 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6590ccce-0a86-4725-a3ff-da68cd527a7f-util" (OuterVolumeSpecName: "util") pod "6590ccce-0a86-4725-a3ff-da68cd527a7f" (UID: "6590ccce-0a86-4725-a3ff-da68cd527a7f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:15:45.852864 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:45.852779 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6590ccce-0a86-4725-a3ff-da68cd527a7f-bundle\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:15:45.852864 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:45.852809 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6590ccce-0a86-4725-a3ff-da68cd527a7f-util\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:15:45.852864 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:45.852820 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bvtw8\" (UniqueName: \"kubernetes.io/projected/6590ccce-0a86-4725-a3ff-da68cd527a7f-kube-api-access-bvtw8\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:15:46.553548 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:46.553513 2572 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" Apr 17 17:15:46.553712 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:46.553550 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwqs" event={"ID":"6590ccce-0a86-4725-a3ff-da68cd527a7f","Type":"ContainerDied","Data":"4e62b83f6b384446b1f65a721ca362c50196a16bb673b2b757dd6425a0a3b644"} Apr 17 17:15:46.553712 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:46.553587 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e62b83f6b384446b1f65a721ca362c50196a16bb673b2b757dd6425a0a3b644" Apr 17 17:15:46.555132 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:46.555098 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-grws5" event={"ID":"9ea4f9b6-9453-49e6-bbcb-127ff98db954","Type":"ContainerStarted","Data":"5e73687e8f27982ec7d4b9e8094d33742f5b5e0e03109b3f240fc81c0eba76ef"} Apr 17 17:15:46.555263 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:46.555247 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-grws5" Apr 17 17:15:46.576797 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:46.576754 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-grws5" podStartSLOduration=1.73220592 podStartE2EDuration="4.576739279s" podCreationTimestamp="2026-04-17 17:15:42 +0000 UTC" firstStartedPulling="2026-04-17 17:15:42.678739788 +0000 UTC m=+470.346423420" lastFinishedPulling="2026-04-17 17:15:45.523273147 +0000 UTC m=+473.190956779" observedRunningTime="2026-04-17 17:15:46.574864535 +0000 UTC m=+474.242548189" watchObservedRunningTime="2026-04-17 17:15:46.576739279 +0000 UTC m=+474.244422935" Apr 17 
17:15:47.507946 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:47.507911 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-tqt9s" Apr 17 17:15:57.560830 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:15:57.560800 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-grws5" Apr 17 17:16:10.522920 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:10.522888 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-zgh7t" Apr 17 17:16:12.762881 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.762842 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"] Apr 17 17:16:12.763244 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.763228 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6590ccce-0a86-4725-a3ff-da68cd527a7f" containerName="util" Apr 17 17:16:12.763244 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.763241 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6590ccce-0a86-4725-a3ff-da68cd527a7f" containerName="util" Apr 17 17:16:12.763317 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.763255 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6590ccce-0a86-4725-a3ff-da68cd527a7f" containerName="pull" Apr 17 17:16:12.763317 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.763261 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6590ccce-0a86-4725-a3ff-da68cd527a7f" containerName="pull" Apr 17 17:16:12.763317 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.763269 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6590ccce-0a86-4725-a3ff-da68cd527a7f" containerName="extract" Apr 17 17:16:12.763317 ip-10-0-138-224 kubenswrapper[2572]: I0417 
17:16:12.763275 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6590ccce-0a86-4725-a3ff-da68cd527a7f" containerName="extract" Apr 17 17:16:12.763438 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.763337 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6590ccce-0a86-4725-a3ff-da68cd527a7f" containerName="extract" Apr 17 17:16:12.766694 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.766677 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s" Apr 17 17:16:12.769279 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.769260 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 17:16:12.769422 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.769401 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 17 17:16:12.769930 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.769911 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 17:16:12.770002 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.769951 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 17 17:16:12.770002 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.769919 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-scpb4\"" Apr 17 17:16:12.776692 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.776672 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"] Apr 17 17:16:12.945249 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.945199 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"istio-token\" (UniqueName: \"kubernetes.io/projected/e161db9b-4d23-41e2-9c33-155fcf18d401-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:12.945400 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.945262 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e161db9b-4d23-41e2-9c33-155fcf18d401-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:12.945400 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.945297 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e161db9b-4d23-41e2-9c33-155fcf18d401-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:12.945400 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.945331 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm2zx\" (UniqueName: \"kubernetes.io/projected/e161db9b-4d23-41e2-9c33-155fcf18d401-kube-api-access-cm2zx\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:12.945400 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.945380 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e161db9b-4d23-41e2-9c33-155fcf18d401-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:12.945546 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.945413 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e161db9b-4d23-41e2-9c33-155fcf18d401-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:12.945546 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:12.945455 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e161db9b-4d23-41e2-9c33-155fcf18d401-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.046532 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.046451 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e161db9b-4d23-41e2-9c33-155fcf18d401-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.046532 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.046511 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e161db9b-4d23-41e2-9c33-155fcf18d401-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.046532 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.046531 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e161db9b-4d23-41e2-9c33-155fcf18d401-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.046788 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.046559 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e161db9b-4d23-41e2-9c33-155fcf18d401-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.046788 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.046577 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cm2zx\" (UniqueName: \"kubernetes.io/projected/e161db9b-4d23-41e2-9c33-155fcf18d401-kube-api-access-cm2zx\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.046788 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.046674 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e161db9b-4d23-41e2-9c33-155fcf18d401-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.046788 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.046748 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e161db9b-4d23-41e2-9c33-155fcf18d401-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.047298 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.047261 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e161db9b-4d23-41e2-9c33-155fcf18d401-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.049042 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.049020 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e161db9b-4d23-41e2-9c33-155fcf18d401-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.049147 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.049114 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e161db9b-4d23-41e2-9c33-155fcf18d401-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.049565 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.049544 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e161db9b-4d23-41e2-9c33-155fcf18d401-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.049610 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.049550 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e161db9b-4d23-41e2-9c33-155fcf18d401-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.054753 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.054722 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm2zx\" (UniqueName: \"kubernetes.io/projected/e161db9b-4d23-41e2-9c33-155fcf18d401-kube-api-access-cm2zx\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.054875 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.054862 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e161db9b-4d23-41e2-9c33-155fcf18d401-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-gsx6s\" (UID: \"e161db9b-4d23-41e2-9c33-155fcf18d401\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.076908 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.076885 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:13.206568 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.205937 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"]
Apr 17 17:16:13.651344 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:13.651305 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s" event={"ID":"e161db9b-4d23-41e2-9c33-155fcf18d401","Type":"ContainerStarted","Data":"c84b58726274197ef4acb2f500a037dc7a885e791945f3275d56a5109ef02f8c"}
Apr 17 17:16:16.195023 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:16.194982 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 17:16:16.195311 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:16.195052 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 17:16:16.666622 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:16.666538 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s" event={"ID":"e161db9b-4d23-41e2-9c33-155fcf18d401","Type":"ContainerStarted","Data":"e024e5d632961effb9042a62255e1930415def6861cdbf21f892d28ffba0042f"}
Apr 17 17:16:16.666622 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:16.666590 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:16.688479 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:16.688426 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s" podStartSLOduration=1.704531697 podStartE2EDuration="4.688414257s" podCreationTimestamp="2026-04-17 17:16:12 +0000 UTC" firstStartedPulling="2026-04-17 17:16:13.210875136 +0000 UTC m=+500.878558771" lastFinishedPulling="2026-04-17 17:16:16.194757699 +0000 UTC m=+503.862441331" observedRunningTime="2026-04-17 17:16:16.686698757 +0000 UTC m=+504.354382410" watchObservedRunningTime="2026-04-17 17:16:16.688414257 +0000 UTC m=+504.356097910"
Apr 17 17:16:17.672356 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:17.672325 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gsx6s"
Apr 17 17:16:41.287962 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.287730 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk"]
Apr 17 17:16:41.292807 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.292774 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk"
Apr 17 17:16:41.295476 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.295438 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 17:16:41.295476 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.295438 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 17:16:41.296384 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.296357 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-8n2w7\""
Apr 17 17:16:41.296904 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.296884 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk"]
Apr 17 17:16:41.375874 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.375835 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c845075-4b05-457f-a97f-3ca76399588e-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk\" (UID: \"2c845075-4b05-457f-a97f-3ca76399588e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk"
Apr 17 17:16:41.376078 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.375897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr5lf\" (UniqueName: \"kubernetes.io/projected/2c845075-4b05-457f-a97f-3ca76399588e-kube-api-access-hr5lf\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk\" (UID: \"2c845075-4b05-457f-a97f-3ca76399588e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk"
Apr 17 17:16:41.376078 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.375992 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c845075-4b05-457f-a97f-3ca76399588e-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk\" (UID: \"2c845075-4b05-457f-a97f-3ca76399588e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk"
Apr 17 17:16:41.477096 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.477034 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hr5lf\" (UniqueName: \"kubernetes.io/projected/2c845075-4b05-457f-a97f-3ca76399588e-kube-api-access-hr5lf\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk\" (UID: \"2c845075-4b05-457f-a97f-3ca76399588e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk"
Apr 17 17:16:41.477342 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.477119 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c845075-4b05-457f-a97f-3ca76399588e-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk\" (UID: \"2c845075-4b05-457f-a97f-3ca76399588e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk"
Apr 17 17:16:41.477342 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.477202 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c845075-4b05-457f-a97f-3ca76399588e-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk\" (UID: \"2c845075-4b05-457f-a97f-3ca76399588e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk"
Apr 17 17:16:41.477579 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.477558 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c845075-4b05-457f-a97f-3ca76399588e-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk\" (UID: \"2c845075-4b05-457f-a97f-3ca76399588e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk"
Apr 17 17:16:41.477622 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.477592 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c845075-4b05-457f-a97f-3ca76399588e-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk\" (UID: \"2c845075-4b05-457f-a97f-3ca76399588e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk"
Apr 17 17:16:41.485984 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.485945 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr5lf\" (UniqueName: \"kubernetes.io/projected/2c845075-4b05-457f-a97f-3ca76399588e-kube-api-access-hr5lf\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk\" (UID: \"2c845075-4b05-457f-a97f-3ca76399588e\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk"
Apr 17 17:16:41.604493 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.604379 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk"
Apr 17 17:16:41.742478 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.742441 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk"]
Apr 17 17:16:41.744240 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:16:41.744178 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c845075_4b05_457f_a97f_3ca76399588e.slice/crio-ae7075c03b7862b67e168716928bea21caa59ecdc64bda6300917dbe36c7babe WatchSource:0}: Error finding container ae7075c03b7862b67e168716928bea21caa59ecdc64bda6300917dbe36c7babe: Status 404 returned error can't find the container with id ae7075c03b7862b67e168716928bea21caa59ecdc64bda6300917dbe36c7babe
Apr 17 17:16:41.766616 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.766584 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk" event={"ID":"2c845075-4b05-457f-a97f-3ca76399588e","Type":"ContainerStarted","Data":"ae7075c03b7862b67e168716928bea21caa59ecdc64bda6300917dbe36c7babe"}
Apr 17 17:16:41.894546 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.894449 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm"]
Apr 17 17:16:41.897928 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.897906 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm"
Apr 17 17:16:41.905787 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.905754 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm"]
Apr 17 17:16:41.982958 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.982912 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm\" (UID: \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm"
Apr 17 17:16:41.982958 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.982958 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm\" (UID: \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm"
Apr 17 17:16:41.983248 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:41.983066 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvkwt\" (UniqueName: \"kubernetes.io/projected/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-kube-api-access-cvkwt\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm\" (UID: \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm"
Apr 17 17:16:42.083551 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.083510 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm\" (UID: \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm"
Apr 17 17:16:42.083551 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.083553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm\" (UID: \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm"
Apr 17 17:16:42.083800 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.083616 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvkwt\" (UniqueName: \"kubernetes.io/projected/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-kube-api-access-cvkwt\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm\" (UID: \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm"
Apr 17 17:16:42.084004 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.083980 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm\" (UID: \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm"
Apr 17 17:16:42.084080 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.084041 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm\" (UID: \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm"
Apr 17 17:16:42.092013 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.091979 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvkwt\" (UniqueName: \"kubernetes.io/projected/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-kube-api-access-cvkwt\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm\" (UID: \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm"
Apr 17 17:16:42.215866 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.215825 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm"
Apr 17 17:16:42.351046 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.351006 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm"]
Apr 17 17:16:42.354668 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:16:42.354634 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a3f5eb4_82f0_4768_a1c6_a2ca68225d7c.slice/crio-d31e6a9a3352f8c6e76b1ccf1f62036213cfc7c30a9e10e26eb1175641c8623a WatchSource:0}: Error finding container d31e6a9a3352f8c6e76b1ccf1f62036213cfc7c30a9e10e26eb1175641c8623a: Status 404 returned error can't find the container with id d31e6a9a3352f8c6e76b1ccf1f62036213cfc7c30a9e10e26eb1175641c8623a
Apr 17 17:16:42.483341 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.483248 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb"]
Apr 17 17:16:42.487022 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.486991 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb"
Apr 17 17:16:42.494424 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.494388 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb"]
Apr 17 17:16:42.588586 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.588546 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmrkt\" (UniqueName: \"kubernetes.io/projected/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-kube-api-access-fmrkt\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb\" (UID: \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb"
Apr 17 17:16:42.588766 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.588602 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb\" (UID: \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb"
Apr 17 17:16:42.588766 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.588663 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb\" (UID: \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb"
Apr 17 17:16:42.689957 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.689913 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmrkt\" (UniqueName: \"kubernetes.io/projected/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-kube-api-access-fmrkt\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb\" (UID: \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb"
Apr 17 17:16:42.690123 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.689975 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb\" (UID: \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb"
Apr 17 17:16:42.690123 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.690008 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb\" (UID: \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb"
Apr 17 17:16:42.690548 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.690521 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb\" (UID: \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb"
Apr 17 17:16:42.690595 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.690532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb\" (UID: \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb"
Apr 17 17:16:42.698081 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.698050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmrkt\" (UniqueName: \"kubernetes.io/projected/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-kube-api-access-fmrkt\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb\" (UID: \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb"
Apr 17 17:16:42.772234 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.772119 2572 generic.go:358] "Generic (PLEG): container finished" podID="7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c" containerID="a416d422c483bd1df8f0dae8eedbac30ed0dc4b5c583a326e66f1616da9eb107" exitCode=0
Apr 17 17:16:42.772234 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.772220 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm" event={"ID":"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c","Type":"ContainerDied","Data":"a416d422c483bd1df8f0dae8eedbac30ed0dc4b5c583a326e66f1616da9eb107"}
Apr 17 17:16:42.772443 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.772263 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm" event={"ID":"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c","Type":"ContainerStarted","Data":"d31e6a9a3352f8c6e76b1ccf1f62036213cfc7c30a9e10e26eb1175641c8623a"}
Apr 17 17:16:42.774005 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.773975 2572 generic.go:358] "Generic (PLEG): container finished" podID="2c845075-4b05-457f-a97f-3ca76399588e" containerID="1df1ceb6a8134e436b24e500a73b20e52da80765928ddc5f78a50923f3bf5569" exitCode=0
Apr 17 17:16:42.774005 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.774035 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk" event={"ID":"2c845075-4b05-457f-a97f-3ca76399588e","Type":"ContainerDied","Data":"1df1ceb6a8134e436b24e500a73b20e52da80765928ddc5f78a50923f3bf5569"}
Apr 17 17:16:42.837806 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.837761 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb"
Apr 17 17:16:42.978035 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:42.977973 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb"]
Apr 17 17:16:42.980703 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:16:42.980669 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f2fdaaa_222d_4cf4_aff4_3c58136512b3.slice/crio-b216f7a10ff73e052a624130c2fa1907691377e8dbca5e098e3a05dcd5b49b26 WatchSource:0}: Error finding container b216f7a10ff73e052a624130c2fa1907691377e8dbca5e098e3a05dcd5b49b26: Status 404 returned error can't find the container with id b216f7a10ff73e052a624130c2fa1907691377e8dbca5e098e3a05dcd5b49b26
Apr 17 17:16:43.088956 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.088919 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5"]
Apr 17 17:16:43.092445 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.092417 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5"
Apr 17 17:16:43.110117 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.110080 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5"]
Apr 17 17:16:43.195662 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.195614 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h8kr\" (UniqueName: \"kubernetes.io/projected/ea3ddf39-29a8-4769-a4af-984d468f356c-kube-api-access-5h8kr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5\" (UID: \"ea3ddf39-29a8-4769-a4af-984d468f356c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5"
Apr 17 17:16:43.195829 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.195730 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea3ddf39-29a8-4769-a4af-984d468f356c-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5\" (UID: \"ea3ddf39-29a8-4769-a4af-984d468f356c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5"
Apr 17 17:16:43.195829 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.195757 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea3ddf39-29a8-4769-a4af-984d468f356c-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5\" (UID: \"ea3ddf39-29a8-4769-a4af-984d468f356c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5"
Apr 17 17:16:43.296987 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.296884 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h8kr\" (UniqueName: \"kubernetes.io/projected/ea3ddf39-29a8-4769-a4af-984d468f356c-kube-api-access-5h8kr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5\" (UID: \"ea3ddf39-29a8-4769-a4af-984d468f356c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5"
Apr 17 17:16:43.296987 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.296965 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea3ddf39-29a8-4769-a4af-984d468f356c-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5\" (UID: \"ea3ddf39-29a8-4769-a4af-984d468f356c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5"
Apr 17 17:16:43.297187 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.296992 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea3ddf39-29a8-4769-a4af-984d468f356c-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5\" (UID: \"ea3ddf39-29a8-4769-a4af-984d468f356c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5"
Apr 17 17:16:43.297488 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.297465 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea3ddf39-29a8-4769-a4af-984d468f356c-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5\" (UID: \"ea3ddf39-29a8-4769-a4af-984d468f356c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5"
Apr 17 17:16:43.297531 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.297473 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea3ddf39-29a8-4769-a4af-984d468f356c-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5\" (UID: \"ea3ddf39-29a8-4769-a4af-984d468f356c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5"
Apr 17 17:16:43.305961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.305934 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h8kr\" (UniqueName: \"kubernetes.io/projected/ea3ddf39-29a8-4769-a4af-984d468f356c-kube-api-access-5h8kr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5\" (UID: \"ea3ddf39-29a8-4769-a4af-984d468f356c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5"
Apr 17 17:16:43.455815 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.455770 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5"
Apr 17 17:16:43.713803 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.713771 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5"]
Apr 17 17:16:43.716290 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:16:43.716256 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea3ddf39_29a8_4769_a4af_984d468f356c.slice/crio-f17719bb30cf38080eebcf643586cc52dc6eaa99bc6036feaca11e99053506b0 WatchSource:0}: Error finding container f17719bb30cf38080eebcf643586cc52dc6eaa99bc6036feaca11e99053506b0: Status 404 returned error can't find the container with id f17719bb30cf38080eebcf643586cc52dc6eaa99bc6036feaca11e99053506b0
Apr 17 17:16:43.780256 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.780190 2572 generic.go:358] "Generic (PLEG): container finished" podID="2c845075-4b05-457f-a97f-3ca76399588e" containerID="1022d0f491c04a656daa86f0cf949a63bb7a1760f7a16763f552c06daeddd6bd" exitCode=0
Apr
17 17:16:43.780456 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.780261 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk" event={"ID":"2c845075-4b05-457f-a97f-3ca76399588e","Type":"ContainerDied","Data":"1022d0f491c04a656daa86f0cf949a63bb7a1760f7a16763f552c06daeddd6bd"} Apr 17 17:16:43.782089 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.782049 2572 generic.go:358] "Generic (PLEG): container finished" podID="7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c" containerID="34e57793ce3f2cadd0eae91262ae65232154e80f1ec26cf2c7006b4632a80778" exitCode=0 Apr 17 17:16:43.782175 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.782130 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm" event={"ID":"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c","Type":"ContainerDied","Data":"34e57793ce3f2cadd0eae91262ae65232154e80f1ec26cf2c7006b4632a80778"} Apr 17 17:16:43.783532 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.783502 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5" event={"ID":"ea3ddf39-29a8-4769-a4af-984d468f356c","Type":"ContainerStarted","Data":"f17719bb30cf38080eebcf643586cc52dc6eaa99bc6036feaca11e99053506b0"} Apr 17 17:16:43.784975 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.784951 2572 generic.go:358] "Generic (PLEG): container finished" podID="8f2fdaaa-222d-4cf4-aff4-3c58136512b3" containerID="4836e8f99b4a4d66d3238dc3006f2eb72b34364dea9f7393b2795c6b86ad7a9b" exitCode=0 Apr 17 17:16:43.785071 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.785034 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb" 
event={"ID":"8f2fdaaa-222d-4cf4-aff4-3c58136512b3","Type":"ContainerDied","Data":"4836e8f99b4a4d66d3238dc3006f2eb72b34364dea9f7393b2795c6b86ad7a9b"} Apr 17 17:16:43.785071 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:43.785066 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb" event={"ID":"8f2fdaaa-222d-4cf4-aff4-3c58136512b3","Type":"ContainerStarted","Data":"b216f7a10ff73e052a624130c2fa1907691377e8dbca5e098e3a05dcd5b49b26"} Apr 17 17:16:44.790934 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:44.790837 2572 generic.go:358] "Generic (PLEG): container finished" podID="7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c" containerID="33ee4215c95d5ae757e451c85d627372087c337fb3f621181504af0b1353efe9" exitCode=0 Apr 17 17:16:44.791381 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:44.790936 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm" event={"ID":"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c","Type":"ContainerDied","Data":"33ee4215c95d5ae757e451c85d627372087c337fb3f621181504af0b1353efe9"} Apr 17 17:16:44.792347 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:44.792320 2572 generic.go:358] "Generic (PLEG): container finished" podID="ea3ddf39-29a8-4769-a4af-984d468f356c" containerID="8b1d0daeae42ec195cf45f4430a2385a7158a6d5fd6caedf637f7eaeec0eb8ce" exitCode=0 Apr 17 17:16:44.792506 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:44.792374 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5" event={"ID":"ea3ddf39-29a8-4769-a4af-984d468f356c","Type":"ContainerDied","Data":"8b1d0daeae42ec195cf45f4430a2385a7158a6d5fd6caedf637f7eaeec0eb8ce"} Apr 17 17:16:44.794166 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:44.794128 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="8f2fdaaa-222d-4cf4-aff4-3c58136512b3" containerID="f939c99b9c423420b815bde8380eb5ad1228763e4e496ad1686fc14e2232f2b1" exitCode=0 Apr 17 17:16:44.794285 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:44.794248 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb" event={"ID":"8f2fdaaa-222d-4cf4-aff4-3c58136512b3","Type":"ContainerDied","Data":"f939c99b9c423420b815bde8380eb5ad1228763e4e496ad1686fc14e2232f2b1"} Apr 17 17:16:44.796307 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:44.796279 2572 generic.go:358] "Generic (PLEG): container finished" podID="2c845075-4b05-457f-a97f-3ca76399588e" containerID="4f6c012e7c0720feec5bf11989dc3561143947e3a051344f8b094e69e8b008d9" exitCode=0 Apr 17 17:16:44.796407 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:44.796332 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk" event={"ID":"2c845075-4b05-457f-a97f-3ca76399588e","Type":"ContainerDied","Data":"4f6c012e7c0720feec5bf11989dc3561143947e3a051344f8b094e69e8b008d9"} Apr 17 17:16:45.802399 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:45.802306 2572 generic.go:358] "Generic (PLEG): container finished" podID="ea3ddf39-29a8-4769-a4af-984d468f356c" containerID="5a84ae771de4cada35348a258b3bad2febc73d1b5998a2380618618b97cb2bfc" exitCode=0 Apr 17 17:16:45.802867 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:45.802396 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5" event={"ID":"ea3ddf39-29a8-4769-a4af-984d468f356c","Type":"ContainerDied","Data":"5a84ae771de4cada35348a258b3bad2febc73d1b5998a2380618618b97cb2bfc"} Apr 17 17:16:45.804514 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:45.804486 2572 generic.go:358] "Generic (PLEG): container finished" podID="8f2fdaaa-222d-4cf4-aff4-3c58136512b3" 
containerID="dd6dade947ac37559d7083d98c9f6040d8380588a347444086de535fdaab1675" exitCode=0 Apr 17 17:16:45.804609 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:45.804545 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb" event={"ID":"8f2fdaaa-222d-4cf4-aff4-3c58136512b3","Type":"ContainerDied","Data":"dd6dade947ac37559d7083d98c9f6040d8380588a347444086de535fdaab1675"} Apr 17 17:16:45.946429 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:45.946390 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk" Apr 17 17:16:46.003659 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.003634 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm" Apr 17 17:16:46.023852 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.023813 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c845075-4b05-457f-a97f-3ca76399588e-bundle\") pod \"2c845075-4b05-457f-a97f-3ca76399588e\" (UID: \"2c845075-4b05-457f-a97f-3ca76399588e\") " Apr 17 17:16:46.024065 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.023861 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c845075-4b05-457f-a97f-3ca76399588e-util\") pod \"2c845075-4b05-457f-a97f-3ca76399588e\" (UID: \"2c845075-4b05-457f-a97f-3ca76399588e\") " Apr 17 17:16:46.024065 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.023893 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr5lf\" (UniqueName: \"kubernetes.io/projected/2c845075-4b05-457f-a97f-3ca76399588e-kube-api-access-hr5lf\") pod 
\"2c845075-4b05-457f-a97f-3ca76399588e\" (UID: \"2c845075-4b05-457f-a97f-3ca76399588e\") " Apr 17 17:16:46.024856 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.024827 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c845075-4b05-457f-a97f-3ca76399588e-bundle" (OuterVolumeSpecName: "bundle") pod "2c845075-4b05-457f-a97f-3ca76399588e" (UID: "2c845075-4b05-457f-a97f-3ca76399588e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:16:46.026901 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.026868 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c845075-4b05-457f-a97f-3ca76399588e-kube-api-access-hr5lf" (OuterVolumeSpecName: "kube-api-access-hr5lf") pod "2c845075-4b05-457f-a97f-3ca76399588e" (UID: "2c845075-4b05-457f-a97f-3ca76399588e"). InnerVolumeSpecName "kube-api-access-hr5lf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:16:46.029974 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.029944 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c845075-4b05-457f-a97f-3ca76399588e-util" (OuterVolumeSpecName: "util") pod "2c845075-4b05-457f-a97f-3ca76399588e" (UID: "2c845075-4b05-457f-a97f-3ca76399588e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:16:46.124949 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.124852 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-bundle\") pod \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\" (UID: \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\") " Apr 17 17:16:46.124949 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.124893 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-util\") pod \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\" (UID: \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\") " Apr 17 17:16:46.125146 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.124989 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvkwt\" (UniqueName: \"kubernetes.io/projected/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-kube-api-access-cvkwt\") pod \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\" (UID: \"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c\") " Apr 17 17:16:46.125146 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.125122 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c845075-4b05-457f-a97f-3ca76399588e-bundle\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:16:46.125146 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.125131 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c845075-4b05-457f-a97f-3ca76399588e-util\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:16:46.125146 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.125141 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hr5lf\" (UniqueName: 
\"kubernetes.io/projected/2c845075-4b05-457f-a97f-3ca76399588e-kube-api-access-hr5lf\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:16:46.125395 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.125352 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-bundle" (OuterVolumeSpecName: "bundle") pod "7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c" (UID: "7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:16:46.127164 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.127134 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-kube-api-access-cvkwt" (OuterVolumeSpecName: "kube-api-access-cvkwt") pod "7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c" (UID: "7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c"). InnerVolumeSpecName "kube-api-access-cvkwt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:16:46.130263 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.130236 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-util" (OuterVolumeSpecName: "util") pod "7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c" (UID: "7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:16:46.226498 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.226452 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-bundle\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:16:46.226498 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.226489 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-util\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:16:46.226498 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.226499 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cvkwt\" (UniqueName: \"kubernetes.io/projected/7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c-kube-api-access-cvkwt\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:16:46.817393 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.817259 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm" event={"ID":"7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c","Type":"ContainerDied","Data":"d31e6a9a3352f8c6e76b1ccf1f62036213cfc7c30a9e10e26eb1175641c8623a"} Apr 17 17:16:46.817393 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.817333 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d31e6a9a3352f8c6e76b1ccf1f62036213cfc7c30a9e10e26eb1175641c8623a" Apr 17 17:16:46.817393 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.817352 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm" Apr 17 17:16:46.819396 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.819365 2572 generic.go:358] "Generic (PLEG): container finished" podID="ea3ddf39-29a8-4769-a4af-984d468f356c" containerID="773f7dcf3250913b27573bd732ffad3094ea267f732be7c4686d91b47d1258e1" exitCode=0 Apr 17 17:16:46.819544 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.819444 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5" event={"ID":"ea3ddf39-29a8-4769-a4af-984d468f356c","Type":"ContainerDied","Data":"773f7dcf3250913b27573bd732ffad3094ea267f732be7c4686d91b47d1258e1"} Apr 17 17:16:46.821188 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.821162 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk" event={"ID":"2c845075-4b05-457f-a97f-3ca76399588e","Type":"ContainerDied","Data":"ae7075c03b7862b67e168716928bea21caa59ecdc64bda6300917dbe36c7babe"} Apr 17 17:16:46.821188 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.821181 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk" Apr 17 17:16:46.821188 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.821192 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae7075c03b7862b67e168716928bea21caa59ecdc64bda6300917dbe36c7babe" Apr 17 17:16:46.955519 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:46.955494 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb" Apr 17 17:16:47.033646 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:47.033606 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-util\") pod \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\" (UID: \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\") " Apr 17 17:16:47.033850 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:47.033656 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-bundle\") pod \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\" (UID: \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\") " Apr 17 17:16:47.033850 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:47.033727 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmrkt\" (UniqueName: \"kubernetes.io/projected/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-kube-api-access-fmrkt\") pod \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\" (UID: \"8f2fdaaa-222d-4cf4-aff4-3c58136512b3\") " Apr 17 17:16:47.034201 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:47.034171 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-bundle" (OuterVolumeSpecName: "bundle") pod "8f2fdaaa-222d-4cf4-aff4-3c58136512b3" (UID: "8f2fdaaa-222d-4cf4-aff4-3c58136512b3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:16:47.035916 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:47.035880 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-kube-api-access-fmrkt" (OuterVolumeSpecName: "kube-api-access-fmrkt") pod "8f2fdaaa-222d-4cf4-aff4-3c58136512b3" (UID: "8f2fdaaa-222d-4cf4-aff4-3c58136512b3"). InnerVolumeSpecName "kube-api-access-fmrkt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:16:47.041425 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:47.041383 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-util" (OuterVolumeSpecName: "util") pod "8f2fdaaa-222d-4cf4-aff4-3c58136512b3" (UID: "8f2fdaaa-222d-4cf4-aff4-3c58136512b3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:16:47.135375 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:47.135265 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmrkt\" (UniqueName: \"kubernetes.io/projected/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-kube-api-access-fmrkt\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:16:47.135375 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:47.135306 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-util\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:16:47.135375 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:47.135321 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2fdaaa-222d-4cf4-aff4-3c58136512b3-bundle\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:16:47.827010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:47.826978 2572 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb" Apr 17 17:16:47.827010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:47.826985 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb" event={"ID":"8f2fdaaa-222d-4cf4-aff4-3c58136512b3","Type":"ContainerDied","Data":"b216f7a10ff73e052a624130c2fa1907691377e8dbca5e098e3a05dcd5b49b26"} Apr 17 17:16:47.827010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:47.827015 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b216f7a10ff73e052a624130c2fa1907691377e8dbca5e098e3a05dcd5b49b26" Apr 17 17:16:47.961469 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:47.961444 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5" Apr 17 17:16:48.043288 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:48.043251 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea3ddf39-29a8-4769-a4af-984d468f356c-bundle\") pod \"ea3ddf39-29a8-4769-a4af-984d468f356c\" (UID: \"ea3ddf39-29a8-4769-a4af-984d468f356c\") " Apr 17 17:16:48.043495 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:48.043322 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea3ddf39-29a8-4769-a4af-984d468f356c-util\") pod \"ea3ddf39-29a8-4769-a4af-984d468f356c\" (UID: \"ea3ddf39-29a8-4769-a4af-984d468f356c\") " Apr 17 17:16:48.043495 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:48.043382 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h8kr\" (UniqueName: \"kubernetes.io/projected/ea3ddf39-29a8-4769-a4af-984d468f356c-kube-api-access-5h8kr\") pod 
\"ea3ddf39-29a8-4769-a4af-984d468f356c\" (UID: \"ea3ddf39-29a8-4769-a4af-984d468f356c\") " Apr 17 17:16:48.043865 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:48.043833 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea3ddf39-29a8-4769-a4af-984d468f356c-bundle" (OuterVolumeSpecName: "bundle") pod "ea3ddf39-29a8-4769-a4af-984d468f356c" (UID: "ea3ddf39-29a8-4769-a4af-984d468f356c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:16:48.045643 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:48.045613 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3ddf39-29a8-4769-a4af-984d468f356c-kube-api-access-5h8kr" (OuterVolumeSpecName: "kube-api-access-5h8kr") pod "ea3ddf39-29a8-4769-a4af-984d468f356c" (UID: "ea3ddf39-29a8-4769-a4af-984d468f356c"). InnerVolumeSpecName "kube-api-access-5h8kr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:16:48.048423 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:48.048381 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea3ddf39-29a8-4769-a4af-984d468f356c-util" (OuterVolumeSpecName: "util") pod "ea3ddf39-29a8-4769-a4af-984d468f356c" (UID: "ea3ddf39-29a8-4769-a4af-984d468f356c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:16:48.144962 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:48.144867 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea3ddf39-29a8-4769-a4af-984d468f356c-util\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:16:48.144962 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:48.144905 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5h8kr\" (UniqueName: \"kubernetes.io/projected/ea3ddf39-29a8-4769-a4af-984d468f356c-kube-api-access-5h8kr\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:16:48.144962 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:48.144916 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea3ddf39-29a8-4769-a4af-984d468f356c-bundle\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:16:48.833133 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:48.833094 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5" event={"ID":"ea3ddf39-29a8-4769-a4af-984d468f356c","Type":"ContainerDied","Data":"f17719bb30cf38080eebcf643586cc52dc6eaa99bc6036feaca11e99053506b0"} Apr 17 17:16:48.833133 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:48.833134 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f17719bb30cf38080eebcf643586cc52dc6eaa99bc6036feaca11e99053506b0" Apr 17 17:16:48.833602 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:48.833158 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5" Apr 17 17:16:54.254818 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.254766 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85c88ddcb8-f5pfv"] Apr 17 17:16:54.255194 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255132 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea3ddf39-29a8-4769-a4af-984d468f356c" containerName="extract" Apr 17 17:16:54.255194 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255143 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3ddf39-29a8-4769-a4af-984d468f356c" containerName="extract" Apr 17 17:16:54.255194 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255154 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea3ddf39-29a8-4769-a4af-984d468f356c" containerName="pull" Apr 17 17:16:54.255194 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255160 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3ddf39-29a8-4769-a4af-984d468f356c" containerName="pull" Apr 17 17:16:54.255194 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255171 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f2fdaaa-222d-4cf4-aff4-3c58136512b3" containerName="util" Apr 17 17:16:54.255194 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255177 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2fdaaa-222d-4cf4-aff4-3c58136512b3" containerName="util" Apr 17 17:16:54.255194 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255185 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c" containerName="util" Apr 17 17:16:54.255194 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255190 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c" 
containerName="util" Apr 17 17:16:54.255194 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255196 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c" containerName="pull" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255201 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c" containerName="pull" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255241 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f2fdaaa-222d-4cf4-aff4-3c58136512b3" containerName="pull" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255247 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2fdaaa-222d-4cf4-aff4-3c58136512b3" containerName="pull" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255255 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c845075-4b05-457f-a97f-3ca76399588e" containerName="pull" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255261 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c845075-4b05-457f-a97f-3ca76399588e" containerName="pull" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255270 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea3ddf39-29a8-4769-a4af-984d468f356c" containerName="util" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255276 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3ddf39-29a8-4769-a4af-984d468f356c" containerName="util" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255283 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c845075-4b05-457f-a97f-3ca76399588e" containerName="extract" Apr 17 17:16:54.255511 ip-10-0-138-224 
kubenswrapper[2572]: I0417 17:16:54.255288 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c845075-4b05-457f-a97f-3ca76399588e" containerName="extract" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255295 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c" containerName="extract" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255300 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c" containerName="extract" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255306 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f2fdaaa-222d-4cf4-aff4-3c58136512b3" containerName="extract" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255311 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2fdaaa-222d-4cf4-aff4-3c58136512b3" containerName="extract" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255321 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c845075-4b05-457f-a97f-3ca76399588e" containerName="util" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255325 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c845075-4b05-457f-a97f-3ca76399588e" containerName="util" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255379 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f2fdaaa-222d-4cf4-aff4-3c58136512b3" containerName="extract" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255388 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c" containerName="extract" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255394 2572 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="2c845075-4b05-457f-a97f-3ca76399588e" containerName="extract" Apr 17 17:16:54.255511 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.255401 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea3ddf39-29a8-4769-a4af-984d468f356c" containerName="extract" Apr 17 17:16:54.264769 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.264740 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.272076 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.272041 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85c88ddcb8-f5pfv"] Apr 17 17:16:54.401504 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.401462 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1551c43c-26fe-414a-a5cc-4073a15ede45-console-serving-cert\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.401504 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.401502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1551c43c-26fe-414a-a5cc-4073a15ede45-console-config\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.401751 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.401594 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hhgz\" (UniqueName: \"kubernetes.io/projected/1551c43c-26fe-414a-a5cc-4073a15ede45-kube-api-access-4hhgz\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " 
pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.401751 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.401651 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1551c43c-26fe-414a-a5cc-4073a15ede45-trusted-ca-bundle\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.401751 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.401698 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1551c43c-26fe-414a-a5cc-4073a15ede45-oauth-serving-cert\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.401751 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.401735 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1551c43c-26fe-414a-a5cc-4073a15ede45-console-oauth-config\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.401896 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.401753 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1551c43c-26fe-414a-a5cc-4073a15ede45-service-ca\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.502670 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.502619 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/1551c43c-26fe-414a-a5cc-4073a15ede45-oauth-serving-cert\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.502859 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.502704 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1551c43c-26fe-414a-a5cc-4073a15ede45-console-oauth-config\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.502859 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.502733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1551c43c-26fe-414a-a5cc-4073a15ede45-service-ca\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.502859 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.502777 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1551c43c-26fe-414a-a5cc-4073a15ede45-console-serving-cert\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.503046 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.502866 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1551c43c-26fe-414a-a5cc-4073a15ede45-console-config\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.503046 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.502949 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4hhgz\" (UniqueName: \"kubernetes.io/projected/1551c43c-26fe-414a-a5cc-4073a15ede45-kube-api-access-4hhgz\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.503046 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.503013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1551c43c-26fe-414a-a5cc-4073a15ede45-trusted-ca-bundle\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.503941 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.503913 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1551c43c-26fe-414a-a5cc-4073a15ede45-oauth-serving-cert\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.504141 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.503953 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1551c43c-26fe-414a-a5cc-4073a15ede45-console-config\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.504286 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.503980 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1551c43c-26fe-414a-a5cc-4073a15ede45-service-ca\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.504673 ip-10-0-138-224 kubenswrapper[2572]: I0417 
17:16:54.504647 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1551c43c-26fe-414a-a5cc-4073a15ede45-trusted-ca-bundle\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.510907 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.506288 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1551c43c-26fe-414a-a5cc-4073a15ede45-console-serving-cert\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.510907 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.506586 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1551c43c-26fe-414a-a5cc-4073a15ede45-console-oauth-config\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.513180 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.513149 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hhgz\" (UniqueName: \"kubernetes.io/projected/1551c43c-26fe-414a-a5cc-4073a15ede45-kube-api-access-4hhgz\") pod \"console-85c88ddcb8-f5pfv\" (UID: \"1551c43c-26fe-414a-a5cc-4073a15ede45\") " pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.576535 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.576480 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:16:54.716501 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.716461 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85c88ddcb8-f5pfv"] Apr 17 17:16:54.720061 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:16:54.720026 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1551c43c_26fe_414a_a5cc_4073a15ede45.slice/crio-586b088c812a50dc56ea622a78ade651e0ba3a5207fe43856110de4b2a29f0c4 WatchSource:0}: Error finding container 586b088c812a50dc56ea622a78ade651e0ba3a5207fe43856110de4b2a29f0c4: Status 404 returned error can't find the container with id 586b088c812a50dc56ea622a78ade651e0ba3a5207fe43856110de4b2a29f0c4 Apr 17 17:16:54.856452 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.856390 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85c88ddcb8-f5pfv" event={"ID":"1551c43c-26fe-414a-a5cc-4073a15ede45","Type":"ContainerStarted","Data":"cd49e2495321e651aaba7021b354b5a1ef88b8b53ae3d4b8778fef10c9b22242"} Apr 17 17:16:54.856452 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.856438 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85c88ddcb8-f5pfv" event={"ID":"1551c43c-26fe-414a-a5cc-4073a15ede45","Type":"ContainerStarted","Data":"586b088c812a50dc56ea622a78ade651e0ba3a5207fe43856110de4b2a29f0c4"} Apr 17 17:16:54.918167 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:16:54.918111 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85c88ddcb8-f5pfv" podStartSLOduration=0.918095256 podStartE2EDuration="918.095256ms" podCreationTimestamp="2026-04-17 17:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:16:54.916859378 +0000 UTC 
m=+542.584543056" watchObservedRunningTime="2026-04-17 17:16:54.918095256 +0000 UTC m=+542.585778909" Apr 17 17:17:02.041090 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:02.041048 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql"] Apr 17 17:17:02.044769 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:02.044742 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" Apr 17 17:17:02.047589 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:02.047562 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-2r9mj\"" Apr 17 17:17:02.047721 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:02.047571 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 17:17:02.048525 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:02.048490 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 17:17:02.056237 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:02.056190 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql"] Apr 17 17:17:02.174281 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:02.174190 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckljb\" (UniqueName: \"kubernetes.io/projected/29f3c9a1-23ca-476e-87fe-3a0592be9512-kube-api-access-ckljb\") pod \"limitador-operator-controller-manager-85c4996f8c-6vjql\" (UID: \"29f3c9a1-23ca-476e-87fe-3a0592be9512\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" Apr 17 17:17:02.275059 ip-10-0-138-224 kubenswrapper[2572]: I0417 
17:17:02.275012 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckljb\" (UniqueName: \"kubernetes.io/projected/29f3c9a1-23ca-476e-87fe-3a0592be9512-kube-api-access-ckljb\") pod \"limitador-operator-controller-manager-85c4996f8c-6vjql\" (UID: \"29f3c9a1-23ca-476e-87fe-3a0592be9512\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" Apr 17 17:17:02.301746 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:02.301668 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckljb\" (UniqueName: \"kubernetes.io/projected/29f3c9a1-23ca-476e-87fe-3a0592be9512-kube-api-access-ckljb\") pod \"limitador-operator-controller-manager-85c4996f8c-6vjql\" (UID: \"29f3c9a1-23ca-476e-87fe-3a0592be9512\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" Apr 17 17:17:02.356961 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:02.356917 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" Apr 17 17:17:02.502563 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:02.502532 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql"] Apr 17 17:17:02.505012 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:17:02.504981 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29f3c9a1_23ca_476e_87fe_3a0592be9512.slice/crio-523e71a00446089bc63c7be2c198e0cdd576714eb386632e39bd0fd555d14b81 WatchSource:0}: Error finding container 523e71a00446089bc63c7be2c198e0cdd576714eb386632e39bd0fd555d14b81: Status 404 returned error can't find the container with id 523e71a00446089bc63c7be2c198e0cdd576714eb386632e39bd0fd555d14b81 Apr 17 17:17:02.890128 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:02.890085 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" event={"ID":"29f3c9a1-23ca-476e-87fe-3a0592be9512","Type":"ContainerStarted","Data":"523e71a00446089bc63c7be2c198e0cdd576714eb386632e39bd0fd555d14b81"} Apr 17 17:17:04.577047 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:04.577001 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:17:04.577578 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:04.577104 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:17:04.583330 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:04.583298 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:17:04.902587 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:04.902510 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/console-85c88ddcb8-f5pfv" Apr 17 17:17:04.958869 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:04.958838 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6644598d65-p9rsv"] Apr 17 17:17:05.903485 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:05.903447 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" event={"ID":"29f3c9a1-23ca-476e-87fe-3a0592be9512","Type":"ContainerStarted","Data":"a5f33a8d01df49f64d7f49602148425992dda790aaab1ea5bd7bd76ab2362d51"} Apr 17 17:17:05.926651 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:05.926585 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" podStartSLOduration=1.477317818 podStartE2EDuration="3.92656686s" podCreationTimestamp="2026-04-17 17:17:02 +0000 UTC" firstStartedPulling="2026-04-17 17:17:02.507008063 +0000 UTC m=+550.174691695" lastFinishedPulling="2026-04-17 17:17:04.956257091 +0000 UTC m=+552.623940737" observedRunningTime="2026-04-17 17:17:05.925036318 +0000 UTC m=+553.592719971" watchObservedRunningTime="2026-04-17 17:17:05.92656686 +0000 UTC m=+553.594250514" Apr 17 17:17:06.907435 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:06.907401 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" Apr 17 17:17:07.895523 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:07.895483 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-btkxs"] Apr 17 17:17:07.899128 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:07.899108 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-btkxs" Apr 17 17:17:07.901746 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:07.901715 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 17:17:07.901900 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:07.901820 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-qz759\"" Apr 17 17:17:07.910371 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:07.910341 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-btkxs"] Apr 17 17:17:08.036177 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:08.036128 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vsfk\" (UniqueName: \"kubernetes.io/projected/f5ed6e24-4094-4171-8cf1-63db13dfd3eb-kube-api-access-7vsfk\") pod \"dns-operator-controller-manager-648d5c98bc-btkxs\" (UID: \"f5ed6e24-4094-4171-8cf1-63db13dfd3eb\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-btkxs" Apr 17 17:17:08.137510 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:08.137465 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vsfk\" (UniqueName: \"kubernetes.io/projected/f5ed6e24-4094-4171-8cf1-63db13dfd3eb-kube-api-access-7vsfk\") pod \"dns-operator-controller-manager-648d5c98bc-btkxs\" (UID: \"f5ed6e24-4094-4171-8cf1-63db13dfd3eb\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-btkxs" Apr 17 17:17:08.146626 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:08.146545 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vsfk\" (UniqueName: \"kubernetes.io/projected/f5ed6e24-4094-4171-8cf1-63db13dfd3eb-kube-api-access-7vsfk\") pod 
\"dns-operator-controller-manager-648d5c98bc-btkxs\" (UID: \"f5ed6e24-4094-4171-8cf1-63db13dfd3eb\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-btkxs" Apr 17 17:17:08.210811 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:08.210765 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-btkxs" Apr 17 17:17:08.345627 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:08.345597 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-btkxs"] Apr 17 17:17:08.347511 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:17:08.347472 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5ed6e24_4094_4171_8cf1_63db13dfd3eb.slice/crio-fd913fd69d5f94c48c719ac842fd6256399a665f534a98426f7828ed719baa94 WatchSource:0}: Error finding container fd913fd69d5f94c48c719ac842fd6256399a665f534a98426f7828ed719baa94: Status 404 returned error can't find the container with id fd913fd69d5f94c48c719ac842fd6256399a665f534a98426f7828ed719baa94 Apr 17 17:17:08.917448 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:08.917408 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-btkxs" event={"ID":"f5ed6e24-4094-4171-8cf1-63db13dfd3eb","Type":"ContainerStarted","Data":"fd913fd69d5f94c48c719ac842fd6256399a665f534a98426f7828ed719baa94"} Apr 17 17:17:10.928401 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:10.928359 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-btkxs" event={"ID":"f5ed6e24-4094-4171-8cf1-63db13dfd3eb","Type":"ContainerStarted","Data":"4e0691ef480eb42dea3738fae989475584ec140f006a391f411fa31619890807"} Apr 17 17:17:10.928852 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:10.928447 2572 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-btkxs" Apr 17 17:17:10.947402 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:10.947340 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-btkxs" podStartSLOduration=1.53078023 podStartE2EDuration="3.947320885s" podCreationTimestamp="2026-04-17 17:17:07 +0000 UTC" firstStartedPulling="2026-04-17 17:17:08.34974323 +0000 UTC m=+556.017426862" lastFinishedPulling="2026-04-17 17:17:10.766283882 +0000 UTC m=+558.433967517" observedRunningTime="2026-04-17 17:17:10.946067067 +0000 UTC m=+558.613750719" watchObservedRunningTime="2026-04-17 17:17:10.947320885 +0000 UTC m=+558.615004541" Apr 17 17:17:17.913513 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:17.913480 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" Apr 17 17:17:18.213517 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.213477 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql"] Apr 17 17:17:18.213807 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.213746 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" podUID="29f3c9a1-23ca-476e-87fe-3a0592be9512" containerName="manager" containerID="cri-o://a5f33a8d01df49f64d7f49602148425992dda790aaab1ea5bd7bd76ab2362d51" gracePeriod=2 Apr 17 17:17:18.230189 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.229795 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql"] Apr 17 17:17:18.234864 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.234167 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz"] Apr 17 17:17:18.234864 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.234614 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29f3c9a1-23ca-476e-87fe-3a0592be9512" containerName="manager" Apr 17 17:17:18.234864 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.234633 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f3c9a1-23ca-476e-87fe-3a0592be9512" containerName="manager" Apr 17 17:17:18.234864 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.234742 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="29f3c9a1-23ca-476e-87fe-3a0592be9512" containerName="manager" Apr 17 17:17:18.239296 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.239268 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" Apr 17 17:17:18.244652 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.244621 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-7gv2w\"" Apr 17 17:17:18.250629 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.250579 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nqghx"] Apr 17 17:17:18.255279 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.255249 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nqghx" Apr 17 17:17:18.257335 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.257302 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz"] Apr 17 17:17:18.274988 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.274954 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nqghx"] Apr 17 17:17:18.289266 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.289195 2572 status_manager.go:895] "Failed to get status for pod" podUID="29f3c9a1-23ca-476e-87fe-3a0592be9512" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" err="pods \"limitador-operator-controller-manager-85c4996f8c-6vjql\" is forbidden: User \"system:node:ip-10-0-138-224.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-224.ec2.internal' and this object" Apr 17 17:17:18.291614 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.291570 2572 status_manager.go:895] "Failed to get status for pod" podUID="29f3c9a1-23ca-476e-87fe-3a0592be9512" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" err="pods \"limitador-operator-controller-manager-85c4996f8c-6vjql\" is forbidden: User \"system:node:ip-10-0-138-224.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-224.ec2.internal' and this object" Apr 17 17:17:18.326103 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.326064 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zrcf\" (UniqueName: \"kubernetes.io/projected/ae6e8b6f-740c-4964-b8b7-e9cc7e48c340-kube-api-access-9zrcf\") pod 
\"limitador-operator-controller-manager-85c4996f8c-nqghx\" (UID: \"ae6e8b6f-740c-4964-b8b7-e9cc7e48c340\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nqghx" Apr 17 17:17:18.326320 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.326186 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx5mn\" (UniqueName: \"kubernetes.io/projected/640d5373-8271-46d1-9815-0561ce942e3b-kube-api-access-hx5mn\") pod \"kuadrant-operator-controller-manager-55c7f4c975-5mjzz\" (UID: \"640d5373-8271-46d1-9815-0561ce942e3b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" Apr 17 17:17:18.326320 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.326300 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/640d5373-8271-46d1-9815-0561ce942e3b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-5mjzz\" (UID: \"640d5373-8271-46d1-9815-0561ce942e3b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" Apr 17 17:17:18.427857 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.427815 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hx5mn\" (UniqueName: \"kubernetes.io/projected/640d5373-8271-46d1-9815-0561ce942e3b-kube-api-access-hx5mn\") pod \"kuadrant-operator-controller-manager-55c7f4c975-5mjzz\" (UID: \"640d5373-8271-46d1-9815-0561ce942e3b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" Apr 17 17:17:18.428049 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.427905 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/640d5373-8271-46d1-9815-0561ce942e3b-extensions-socket-volume\") pod 
\"kuadrant-operator-controller-manager-55c7f4c975-5mjzz\" (UID: \"640d5373-8271-46d1-9815-0561ce942e3b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" Apr 17 17:17:18.428049 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.427973 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zrcf\" (UniqueName: \"kubernetes.io/projected/ae6e8b6f-740c-4964-b8b7-e9cc7e48c340-kube-api-access-9zrcf\") pod \"limitador-operator-controller-manager-85c4996f8c-nqghx\" (UID: \"ae6e8b6f-740c-4964-b8b7-e9cc7e48c340\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nqghx" Apr 17 17:17:18.428364 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.428342 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/640d5373-8271-46d1-9815-0561ce942e3b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-5mjzz\" (UID: \"640d5373-8271-46d1-9815-0561ce942e3b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" Apr 17 17:17:18.436270 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.436227 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx5mn\" (UniqueName: \"kubernetes.io/projected/640d5373-8271-46d1-9815-0561ce942e3b-kube-api-access-hx5mn\") pod \"kuadrant-operator-controller-manager-55c7f4c975-5mjzz\" (UID: \"640d5373-8271-46d1-9815-0561ce942e3b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" Apr 17 17:17:18.436748 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.436724 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zrcf\" (UniqueName: \"kubernetes.io/projected/ae6e8b6f-740c-4964-b8b7-e9cc7e48c340-kube-api-access-9zrcf\") pod \"limitador-operator-controller-manager-85c4996f8c-nqghx\" (UID: 
\"ae6e8b6f-740c-4964-b8b7-e9cc7e48c340\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nqghx" Apr 17 17:17:18.457510 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.457474 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" Apr 17 17:17:18.463072 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.463032 2572 status_manager.go:895] "Failed to get status for pod" podUID="29f3c9a1-23ca-476e-87fe-3a0592be9512" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" err="pods \"limitador-operator-controller-manager-85c4996f8c-6vjql\" is forbidden: User \"system:node:ip-10-0-138-224.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-224.ec2.internal' and this object" Apr 17 17:17:18.528607 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.528501 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckljb\" (UniqueName: \"kubernetes.io/projected/29f3c9a1-23ca-476e-87fe-3a0592be9512-kube-api-access-ckljb\") pod \"29f3c9a1-23ca-476e-87fe-3a0592be9512\" (UID: \"29f3c9a1-23ca-476e-87fe-3a0592be9512\") " Apr 17 17:17:18.530728 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.530695 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f3c9a1-23ca-476e-87fe-3a0592be9512-kube-api-access-ckljb" (OuterVolumeSpecName: "kube-api-access-ckljb") pod "29f3c9a1-23ca-476e-87fe-3a0592be9512" (UID: "29f3c9a1-23ca-476e-87fe-3a0592be9512"). InnerVolumeSpecName "kube-api-access-ckljb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:17:18.584459 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.584415 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" Apr 17 17:17:18.593339 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.593303 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nqghx" Apr 17 17:17:18.630318 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.630276 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ckljb\" (UniqueName: \"kubernetes.io/projected/29f3c9a1-23ca-476e-87fe-3a0592be9512-kube-api-access-ckljb\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:17:18.746814 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.746785 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz"] Apr 17 17:17:18.748739 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:17:18.748710 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod640d5373_8271_46d1_9815_0561ce942e3b.slice/crio-a7a0ebd04b97960eae3e6284e5ed3ccdae3607fb466d7af547ba8e754c58db1f WatchSource:0}: Error finding container a7a0ebd04b97960eae3e6284e5ed3ccdae3607fb466d7af547ba8e754c58db1f: Status 404 returned error can't find the container with id a7a0ebd04b97960eae3e6284e5ed3ccdae3607fb466d7af547ba8e754c58db1f Apr 17 17:17:18.764010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.763980 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nqghx"] Apr 17 17:17:18.766050 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:17:18.766014 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae6e8b6f_740c_4964_b8b7_e9cc7e48c340.slice/crio-b8b04366e6bfabee846c46529871b51fb3e114588dd61b31ef1ea6ef115e4ebf WatchSource:0}: Error 
finding container b8b04366e6bfabee846c46529871b51fb3e114588dd61b31ef1ea6ef115e4ebf: Status 404 returned error can't find the container with id b8b04366e6bfabee846c46529871b51fb3e114588dd61b31ef1ea6ef115e4ebf Apr 17 17:17:18.950740 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.950702 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29f3c9a1-23ca-476e-87fe-3a0592be9512" path="/var/lib/kubelet/pods/29f3c9a1-23ca-476e-87fe-3a0592be9512/volumes" Apr 17 17:17:18.962844 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.962811 2572 generic.go:358] "Generic (PLEG): container finished" podID="29f3c9a1-23ca-476e-87fe-3a0592be9512" containerID="a5f33a8d01df49f64d7f49602148425992dda790aaab1ea5bd7bd76ab2362d51" exitCode=0 Apr 17 17:17:18.963027 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.962865 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6vjql" Apr 17 17:17:18.963027 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.962901 2572 scope.go:117] "RemoveContainer" containerID="a5f33a8d01df49f64d7f49602148425992dda790aaab1ea5bd7bd76ab2362d51" Apr 17 17:17:18.964147 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.964117 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" event={"ID":"640d5373-8271-46d1-9815-0561ce942e3b","Type":"ContainerStarted","Data":"a7a0ebd04b97960eae3e6284e5ed3ccdae3607fb466d7af547ba8e754c58db1f"} Apr 17 17:17:18.966368 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.966334 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nqghx" event={"ID":"ae6e8b6f-740c-4964-b8b7-e9cc7e48c340","Type":"ContainerStarted","Data":"709cb8554a8115c8243cad347e9a286e190b3f0124c2d94bc038a3913dfe4610"} Apr 17 17:17:18.966499 ip-10-0-138-224 kubenswrapper[2572]: I0417 
17:17:18.966481 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nqghx" Apr 17 17:17:18.966569 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.966510 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nqghx" event={"ID":"ae6e8b6f-740c-4964-b8b7-e9cc7e48c340","Type":"ContainerStarted","Data":"b8b04366e6bfabee846c46529871b51fb3e114588dd61b31ef1ea6ef115e4ebf"} Apr 17 17:17:18.972492 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.972467 2572 scope.go:117] "RemoveContainer" containerID="a5f33a8d01df49f64d7f49602148425992dda790aaab1ea5bd7bd76ab2362d51" Apr 17 17:17:18.972786 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:17:18.972765 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f33a8d01df49f64d7f49602148425992dda790aaab1ea5bd7bd76ab2362d51\": container with ID starting with a5f33a8d01df49f64d7f49602148425992dda790aaab1ea5bd7bd76ab2362d51 not found: ID does not exist" containerID="a5f33a8d01df49f64d7f49602148425992dda790aaab1ea5bd7bd76ab2362d51" Apr 17 17:17:18.972856 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.972795 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f33a8d01df49f64d7f49602148425992dda790aaab1ea5bd7bd76ab2362d51"} err="failed to get container status \"a5f33a8d01df49f64d7f49602148425992dda790aaab1ea5bd7bd76ab2362d51\": rpc error: code = NotFound desc = could not find container \"a5f33a8d01df49f64d7f49602148425992dda790aaab1ea5bd7bd76ab2362d51\": container with ID starting with a5f33a8d01df49f64d7f49602148425992dda790aaab1ea5bd7bd76ab2362d51 not found: ID does not exist" Apr 17 17:17:18.986015 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:18.985948 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nqghx" podStartSLOduration=0.985929547 podStartE2EDuration="985.929547ms" podCreationTimestamp="2026-04-17 17:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:17:18.985078445 +0000 UTC m=+566.652762101" watchObservedRunningTime="2026-04-17 17:17:18.985929547 +0000 UTC m=+566.653613205" Apr 17 17:17:21.935798 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:21.935759 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-btkxs" Apr 17 17:17:22.985797 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:22.985751 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" event={"ID":"640d5373-8271-46d1-9815-0561ce942e3b","Type":"ContainerStarted","Data":"52a5091ff9a94f8bba3a077bb1f6b00ab690bdeb7b59e899f107b5f7be9ee006"} Apr 17 17:17:22.986327 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:22.985820 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" Apr 17 17:17:23.011594 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:23.011482 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" podStartSLOduration=1.007969361 podStartE2EDuration="5.011461675s" podCreationTimestamp="2026-04-17 17:17:18 +0000 UTC" firstStartedPulling="2026-04-17 17:17:18.751231965 +0000 UTC m=+566.418915597" lastFinishedPulling="2026-04-17 17:17:22.754724263 +0000 UTC m=+570.422407911" observedRunningTime="2026-04-17 17:17:23.009719669 +0000 UTC m=+570.677403406" watchObservedRunningTime="2026-04-17 17:17:23.011461675 +0000 UTC m=+570.679145331" Apr 17 17:17:29.974265 
ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:29.974228 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nqghx" Apr 17 17:17:29.985137 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:29.985068 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6644598d65-p9rsv" podUID="8a464cdd-539a-4eb5-9ff8-9febb28a748e" containerName="console" containerID="cri-o://3b416bf1ee46a8c3d7fe65efb62a0e90d3f8d6f7817e112f3a0166f75497fcb5" gracePeriod=15 Apr 17 17:17:30.240734 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.240707 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6644598d65-p9rsv_8a464cdd-539a-4eb5-9ff8-9febb28a748e/console/0.log" Apr 17 17:17:30.240870 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.240774 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:17:30.338179 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.338133 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-oauth-config\") pod \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " Apr 17 17:17:30.338422 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.338195 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf5lb\" (UniqueName: \"kubernetes.io/projected/8a464cdd-539a-4eb5-9ff8-9febb28a748e-kube-api-access-mf5lb\") pod \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " Apr 17 17:17:30.338422 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.338266 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-config\") pod \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " Apr 17 17:17:30.338422 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.338326 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-trusted-ca-bundle\") pod \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " Apr 17 17:17:30.338422 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.338358 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-service-ca\") pod \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " Apr 17 17:17:30.338422 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.338390 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-serving-cert\") pod \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " Apr 17 17:17:30.338653 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.338441 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-oauth-serving-cert\") pod \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\" (UID: \"8a464cdd-539a-4eb5-9ff8-9febb28a748e\") " Apr 17 17:17:30.338726 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.338686 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-config" (OuterVolumeSpecName: "console-config") pod 
"8a464cdd-539a-4eb5-9ff8-9febb28a748e" (UID: "8a464cdd-539a-4eb5-9ff8-9febb28a748e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:17:30.338972 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.338940 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8a464cdd-539a-4eb5-9ff8-9febb28a748e" (UID: "8a464cdd-539a-4eb5-9ff8-9febb28a748e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:17:30.338972 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.338935 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-service-ca" (OuterVolumeSpecName: "service-ca") pod "8a464cdd-539a-4eb5-9ff8-9febb28a748e" (UID: "8a464cdd-539a-4eb5-9ff8-9febb28a748e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:17:30.339133 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.339043 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8a464cdd-539a-4eb5-9ff8-9febb28a748e" (UID: "8a464cdd-539a-4eb5-9ff8-9febb28a748e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:17:30.340761 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.340726 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8a464cdd-539a-4eb5-9ff8-9febb28a748e" (UID: "8a464cdd-539a-4eb5-9ff8-9febb28a748e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:17:30.340761 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.340742 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a464cdd-539a-4eb5-9ff8-9febb28a748e-kube-api-access-mf5lb" (OuterVolumeSpecName: "kube-api-access-mf5lb") pod "8a464cdd-539a-4eb5-9ff8-9febb28a748e" (UID: "8a464cdd-539a-4eb5-9ff8-9febb28a748e"). InnerVolumeSpecName "kube-api-access-mf5lb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:17:30.340938 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.340806 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8a464cdd-539a-4eb5-9ff8-9febb28a748e" (UID: "8a464cdd-539a-4eb5-9ff8-9febb28a748e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:17:30.439343 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.439301 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-oauth-config\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:17:30.439343 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.439335 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mf5lb\" (UniqueName: \"kubernetes.io/projected/8a464cdd-539a-4eb5-9ff8-9febb28a748e-kube-api-access-mf5lb\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:17:30.439343 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.439348 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-config\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:17:30.439596 
ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.439357 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-trusted-ca-bundle\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:17:30.439596 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.439369 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-service-ca\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:17:30.439596 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.439377 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a464cdd-539a-4eb5-9ff8-9febb28a748e-console-serving-cert\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:17:30.439596 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:30.439388 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a464cdd-539a-4eb5-9ff8-9febb28a748e-oauth-serving-cert\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:17:31.021204 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:31.021169 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6644598d65-p9rsv_8a464cdd-539a-4eb5-9ff8-9febb28a748e/console/0.log" Apr 17 17:17:31.021653 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:31.021241 2572 generic.go:358] "Generic (PLEG): container finished" podID="8a464cdd-539a-4eb5-9ff8-9febb28a748e" containerID="3b416bf1ee46a8c3d7fe65efb62a0e90d3f8d6f7817e112f3a0166f75497fcb5" exitCode=2 Apr 17 17:17:31.021653 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:31.021299 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6644598d65-p9rsv" Apr 17 17:17:31.021653 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:31.021320 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6644598d65-p9rsv" event={"ID":"8a464cdd-539a-4eb5-9ff8-9febb28a748e","Type":"ContainerDied","Data":"3b416bf1ee46a8c3d7fe65efb62a0e90d3f8d6f7817e112f3a0166f75497fcb5"} Apr 17 17:17:31.021653 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:31.021354 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6644598d65-p9rsv" event={"ID":"8a464cdd-539a-4eb5-9ff8-9febb28a748e","Type":"ContainerDied","Data":"b44ec9ff2ca9e5a04f9a2d87ef4d49448bbdd2e0f50981041fa2428a7d45bd7b"} Apr 17 17:17:31.021653 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:31.021373 2572 scope.go:117] "RemoveContainer" containerID="3b416bf1ee46a8c3d7fe65efb62a0e90d3f8d6f7817e112f3a0166f75497fcb5" Apr 17 17:17:31.032879 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:31.032853 2572 scope.go:117] "RemoveContainer" containerID="3b416bf1ee46a8c3d7fe65efb62a0e90d3f8d6f7817e112f3a0166f75497fcb5" Apr 17 17:17:31.033232 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:17:31.033190 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b416bf1ee46a8c3d7fe65efb62a0e90d3f8d6f7817e112f3a0166f75497fcb5\": container with ID starting with 3b416bf1ee46a8c3d7fe65efb62a0e90d3f8d6f7817e112f3a0166f75497fcb5 not found: ID does not exist" containerID="3b416bf1ee46a8c3d7fe65efb62a0e90d3f8d6f7817e112f3a0166f75497fcb5" Apr 17 17:17:31.033309 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:31.033239 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b416bf1ee46a8c3d7fe65efb62a0e90d3f8d6f7817e112f3a0166f75497fcb5"} err="failed to get container status \"3b416bf1ee46a8c3d7fe65efb62a0e90d3f8d6f7817e112f3a0166f75497fcb5\": rpc error: code = 
NotFound desc = could not find container \"3b416bf1ee46a8c3d7fe65efb62a0e90d3f8d6f7817e112f3a0166f75497fcb5\": container with ID starting with 3b416bf1ee46a8c3d7fe65efb62a0e90d3f8d6f7817e112f3a0166f75497fcb5 not found: ID does not exist" Apr 17 17:17:31.040991 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:31.040951 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6644598d65-p9rsv"] Apr 17 17:17:31.043755 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:31.043723 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6644598d65-p9rsv"] Apr 17 17:17:32.950668 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:32.950629 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a464cdd-539a-4eb5-9ff8-9febb28a748e" path="/var/lib/kubelet/pods/8a464cdd-539a-4eb5-9ff8-9febb28a748e/volumes" Apr 17 17:17:33.992839 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:33.992796 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" Apr 17 17:17:52.863238 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:52.863192 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:17:52.863739 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:17:52.863687 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:18:43.776042 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:43.775979 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-68c6f48597-xtmk2"] Apr 17 17:18:43.776710 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:43.776687 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="8a464cdd-539a-4eb5-9ff8-9febb28a748e" containerName="console" Apr 17 17:18:43.776992 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:43.776713 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a464cdd-539a-4eb5-9ff8-9febb28a748e" containerName="console" Apr 17 17:18:43.776992 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:43.776800 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a464cdd-539a-4eb5-9ff8-9febb28a748e" containerName="console" Apr 17 17:18:43.780278 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:43.780248 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-68c6f48597-xtmk2" Apr 17 17:18:43.783530 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:43.783501 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-wkvz6\"" Apr 17 17:18:43.783530 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:43.783525 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 17:18:43.792860 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:43.792830 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-68c6f48597-xtmk2"] Apr 17 17:18:43.888651 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:43.888610 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl2rx\" (UniqueName: \"kubernetes.io/projected/2b61e0cc-202f-4a26-bbd4-9f90c147c267-kube-api-access-xl2rx\") pod \"maas-controller-68c6f48597-xtmk2\" (UID: \"2b61e0cc-202f-4a26-bbd4-9f90c147c267\") " pod="opendatahub/maas-controller-68c6f48597-xtmk2" Apr 17 17:18:43.989274 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:43.989194 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl2rx\" (UniqueName: 
\"kubernetes.io/projected/2b61e0cc-202f-4a26-bbd4-9f90c147c267-kube-api-access-xl2rx\") pod \"maas-controller-68c6f48597-xtmk2\" (UID: \"2b61e0cc-202f-4a26-bbd4-9f90c147c267\") " pod="opendatahub/maas-controller-68c6f48597-xtmk2" Apr 17 17:18:43.997355 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:43.997323 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl2rx\" (UniqueName: \"kubernetes.io/projected/2b61e0cc-202f-4a26-bbd4-9f90c147c267-kube-api-access-xl2rx\") pod \"maas-controller-68c6f48597-xtmk2\" (UID: \"2b61e0cc-202f-4a26-bbd4-9f90c147c267\") " pod="opendatahub/maas-controller-68c6f48597-xtmk2" Apr 17 17:18:44.093286 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.093179 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-68c6f48597-xtmk2" Apr 17 17:18:44.242937 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.242898 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-68c6f48597-xtmk2"] Apr 17 17:18:44.244444 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:18:44.244411 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b61e0cc_202f_4a26_bbd4_9f90c147c267.slice/crio-4d45e6a980cf86b06b23232c2b0d995b2a59e2cd455286ca1bc3cc0ba911c6dc WatchSource:0}: Error finding container 4d45e6a980cf86b06b23232c2b0d995b2a59e2cd455286ca1bc3cc0ba911c6dc: Status 404 returned error can't find the container with id 4d45e6a980cf86b06b23232c2b0d995b2a59e2cd455286ca1bc3cc0ba911c6dc Apr 17 17:18:44.245941 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.245921 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:18:44.332366 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.332313 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-68c6f48597-xtmk2" 
event={"ID":"2b61e0cc-202f-4a26-bbd4-9f90c147c267","Type":"ContainerStarted","Data":"4d45e6a980cf86b06b23232c2b0d995b2a59e2cd455286ca1bc3cc0ba911c6dc"} Apr 17 17:18:44.603979 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.603938 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-77cd854497-p8kqx"] Apr 17 17:18:44.608821 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.608793 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-77cd854497-p8kqx" Apr 17 17:18:44.611514 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.611453 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-7k4fn\"" Apr 17 17:18:44.611688 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.611622 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 17 17:18:44.614835 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.614808 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-77cd854497-p8kqx"] Apr 17 17:18:44.695703 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.695665 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p9w4\" (UniqueName: \"kubernetes.io/projected/459cb84c-3e80-4510-a443-f0518e1a38e8-kube-api-access-2p9w4\") pod \"maas-api-77cd854497-p8kqx\" (UID: \"459cb84c-3e80-4510-a443-f0518e1a38e8\") " pod="opendatahub/maas-api-77cd854497-p8kqx" Apr 17 17:18:44.695917 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.695724 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/459cb84c-3e80-4510-a443-f0518e1a38e8-maas-api-tls\") pod \"maas-api-77cd854497-p8kqx\" (UID: \"459cb84c-3e80-4510-a443-f0518e1a38e8\") " pod="opendatahub/maas-api-77cd854497-p8kqx" Apr 17 17:18:44.796529 
ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.796494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2p9w4\" (UniqueName: \"kubernetes.io/projected/459cb84c-3e80-4510-a443-f0518e1a38e8-kube-api-access-2p9w4\") pod \"maas-api-77cd854497-p8kqx\" (UID: \"459cb84c-3e80-4510-a443-f0518e1a38e8\") " pod="opendatahub/maas-api-77cd854497-p8kqx" Apr 17 17:18:44.797010 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.796570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/459cb84c-3e80-4510-a443-f0518e1a38e8-maas-api-tls\") pod \"maas-api-77cd854497-p8kqx\" (UID: \"459cb84c-3e80-4510-a443-f0518e1a38e8\") " pod="opendatahub/maas-api-77cd854497-p8kqx" Apr 17 17:18:44.799698 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.799662 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/459cb84c-3e80-4510-a443-f0518e1a38e8-maas-api-tls\") pod \"maas-api-77cd854497-p8kqx\" (UID: \"459cb84c-3e80-4510-a443-f0518e1a38e8\") " pod="opendatahub/maas-api-77cd854497-p8kqx" Apr 17 17:18:44.806917 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.806882 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p9w4\" (UniqueName: \"kubernetes.io/projected/459cb84c-3e80-4510-a443-f0518e1a38e8-kube-api-access-2p9w4\") pod \"maas-api-77cd854497-p8kqx\" (UID: \"459cb84c-3e80-4510-a443-f0518e1a38e8\") " pod="opendatahub/maas-api-77cd854497-p8kqx" Apr 17 17:18:44.921370 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:44.921271 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-77cd854497-p8kqx" Apr 17 17:18:45.091596 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:45.091556 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-77cd854497-p8kqx"] Apr 17 17:18:45.093336 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:18:45.093299 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod459cb84c_3e80_4510_a443_f0518e1a38e8.slice/crio-840c550d55b57a9eeab7611788f783301bcfbb700c1d63db83359cd5db58d970 WatchSource:0}: Error finding container 840c550d55b57a9eeab7611788f783301bcfbb700c1d63db83359cd5db58d970: Status 404 returned error can't find the container with id 840c550d55b57a9eeab7611788f783301bcfbb700c1d63db83359cd5db58d970 Apr 17 17:18:45.338878 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:45.338827 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-77cd854497-p8kqx" event={"ID":"459cb84c-3e80-4510-a443-f0518e1a38e8","Type":"ContainerStarted","Data":"840c550d55b57a9eeab7611788f783301bcfbb700c1d63db83359cd5db58d970"} Apr 17 17:18:47.348612 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:47.348499 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-77cd854497-p8kqx" event={"ID":"459cb84c-3e80-4510-a443-f0518e1a38e8","Type":"ContainerStarted","Data":"562f3202959276403f4e67cb81e7de922a11a9cf6400ee53307c6618244f757f"} Apr 17 17:18:47.348612 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:47.348605 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-77cd854497-p8kqx" Apr 17 17:18:47.350144 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:47.350110 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-68c6f48597-xtmk2" 
event={"ID":"2b61e0cc-202f-4a26-bbd4-9f90c147c267","Type":"ContainerStarted","Data":"fec7891a6ac7e1f1b26e574de41c33b6c1401423e83658b71c4aa513e1e49b30"} Apr 17 17:18:47.350323 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:47.350249 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-68c6f48597-xtmk2" Apr 17 17:18:47.374407 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:47.374337 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-77cd854497-p8kqx" podStartSLOduration=1.418060418 podStartE2EDuration="3.374316559s" podCreationTimestamp="2026-04-17 17:18:44 +0000 UTC" firstStartedPulling="2026-04-17 17:18:45.094947782 +0000 UTC m=+652.762631419" lastFinishedPulling="2026-04-17 17:18:47.051203914 +0000 UTC m=+654.718887560" observedRunningTime="2026-04-17 17:18:47.372373899 +0000 UTC m=+655.040057552" watchObservedRunningTime="2026-04-17 17:18:47.374316559 +0000 UTC m=+655.042000214" Apr 17 17:18:47.391669 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:47.391612 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-68c6f48597-xtmk2" podStartSLOduration=1.5893336150000001 podStartE2EDuration="4.391594062s" podCreationTimestamp="2026-04-17 17:18:43 +0000 UTC" firstStartedPulling="2026-04-17 17:18:44.246138627 +0000 UTC m=+651.913822266" lastFinishedPulling="2026-04-17 17:18:47.048399079 +0000 UTC m=+654.716082713" observedRunningTime="2026-04-17 17:18:47.388511496 +0000 UTC m=+655.056195146" watchObservedRunningTime="2026-04-17 17:18:47.391594062 +0000 UTC m=+655.059277724" Apr 17 17:18:53.361347 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:53.361315 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-77cd854497-p8kqx" Apr 17 17:18:58.360873 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:58.360837 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="opendatahub/maas-controller-68c6f48597-xtmk2" Apr 17 17:18:58.650719 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:58.650604 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-84979b9f54-f7cb9"] Apr 17 17:18:58.654326 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:58.654300 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-84979b9f54-f7cb9" Apr 17 17:18:58.663058 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:58.663029 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-84979b9f54-f7cb9"] Apr 17 17:18:58.722427 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:58.722378 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-747kl\" (UniqueName: \"kubernetes.io/projected/184b5230-2f56-4173-85d8-c3ad2269f4b0-kube-api-access-747kl\") pod \"maas-controller-84979b9f54-f7cb9\" (UID: \"184b5230-2f56-4173-85d8-c3ad2269f4b0\") " pod="opendatahub/maas-controller-84979b9f54-f7cb9" Apr 17 17:18:58.823586 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:58.823542 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-747kl\" (UniqueName: \"kubernetes.io/projected/184b5230-2f56-4173-85d8-c3ad2269f4b0-kube-api-access-747kl\") pod \"maas-controller-84979b9f54-f7cb9\" (UID: \"184b5230-2f56-4173-85d8-c3ad2269f4b0\") " pod="opendatahub/maas-controller-84979b9f54-f7cb9" Apr 17 17:18:58.833008 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:58.832975 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-747kl\" (UniqueName: \"kubernetes.io/projected/184b5230-2f56-4173-85d8-c3ad2269f4b0-kube-api-access-747kl\") pod \"maas-controller-84979b9f54-f7cb9\" (UID: \"184b5230-2f56-4173-85d8-c3ad2269f4b0\") " pod="opendatahub/maas-controller-84979b9f54-f7cb9" Apr 17 17:18:58.966594 ip-10-0-138-224 
kubenswrapper[2572]: I0417 17:18:58.966528 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-84979b9f54-f7cb9" Apr 17 17:18:59.101539 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:59.101498 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-84979b9f54-f7cb9"] Apr 17 17:18:59.103249 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:18:59.103201 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod184b5230_2f56_4173_85d8_c3ad2269f4b0.slice/crio-61700462fb11b0cecc80314ccf35fb31d61a5a318f6ef4c41b912c82ae4907af WatchSource:0}: Error finding container 61700462fb11b0cecc80314ccf35fb31d61a5a318f6ef4c41b912c82ae4907af: Status 404 returned error can't find the container with id 61700462fb11b0cecc80314ccf35fb31d61a5a318f6ef4c41b912c82ae4907af Apr 17 17:18:59.408014 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:18:59.407971 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84979b9f54-f7cb9" event={"ID":"184b5230-2f56-4173-85d8-c3ad2269f4b0","Type":"ContainerStarted","Data":"61700462fb11b0cecc80314ccf35fb31d61a5a318f6ef4c41b912c82ae4907af"} Apr 17 17:19:00.414204 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:00.414153 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84979b9f54-f7cb9" event={"ID":"184b5230-2f56-4173-85d8-c3ad2269f4b0","Type":"ContainerStarted","Data":"fb9b0447b64ec75350b01506dbf2f46dd7bb29f2a60313369e2a13e306952af8"} Apr 17 17:19:00.414698 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:00.414374 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-84979b9f54-f7cb9" Apr 17 17:19:00.431787 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:00.431724 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/maas-controller-84979b9f54-f7cb9" podStartSLOduration=2.146334304 podStartE2EDuration="2.431707499s" podCreationTimestamp="2026-04-17 17:18:58 +0000 UTC" firstStartedPulling="2026-04-17 17:18:59.104625115 +0000 UTC m=+666.772308747" lastFinishedPulling="2026-04-17 17:18:59.389998303 +0000 UTC m=+667.057681942" observedRunningTime="2026-04-17 17:19:00.429192712 +0000 UTC m=+668.096876368" watchObservedRunningTime="2026-04-17 17:19:00.431707499 +0000 UTC m=+668.099391168" Apr 17 17:19:11.425130 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:11.425092 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-84979b9f54-f7cb9" Apr 17 17:19:11.464354 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:11.464320 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-68c6f48597-xtmk2"] Apr 17 17:19:11.464618 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:11.464593 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-68c6f48597-xtmk2" podUID="2b61e0cc-202f-4a26-bbd4-9f90c147c267" containerName="manager" containerID="cri-o://fec7891a6ac7e1f1b26e574de41c33b6c1401423e83658b71c4aa513e1e49b30" gracePeriod=10 Apr 17 17:19:11.720854 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:11.720827 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-68c6f48597-xtmk2" Apr 17 17:19:11.846752 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:11.846715 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl2rx\" (UniqueName: \"kubernetes.io/projected/2b61e0cc-202f-4a26-bbd4-9f90c147c267-kube-api-access-xl2rx\") pod \"2b61e0cc-202f-4a26-bbd4-9f90c147c267\" (UID: \"2b61e0cc-202f-4a26-bbd4-9f90c147c267\") " Apr 17 17:19:11.849057 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:11.849015 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b61e0cc-202f-4a26-bbd4-9f90c147c267-kube-api-access-xl2rx" (OuterVolumeSpecName: "kube-api-access-xl2rx") pod "2b61e0cc-202f-4a26-bbd4-9f90c147c267" (UID: "2b61e0cc-202f-4a26-bbd4-9f90c147c267"). InnerVolumeSpecName "kube-api-access-xl2rx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:19:11.947542 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:11.947498 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xl2rx\" (UniqueName: \"kubernetes.io/projected/2b61e0cc-202f-4a26-bbd4-9f90c147c267-kube-api-access-xl2rx\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:19:12.469732 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:12.469697 2572 generic.go:358] "Generic (PLEG): container finished" podID="2b61e0cc-202f-4a26-bbd4-9f90c147c267" containerID="fec7891a6ac7e1f1b26e574de41c33b6c1401423e83658b71c4aa513e1e49b30" exitCode=0 Apr 17 17:19:12.470154 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:12.469773 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-68c6f48597-xtmk2" Apr 17 17:19:12.470154 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:12.469787 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-68c6f48597-xtmk2" event={"ID":"2b61e0cc-202f-4a26-bbd4-9f90c147c267","Type":"ContainerDied","Data":"fec7891a6ac7e1f1b26e574de41c33b6c1401423e83658b71c4aa513e1e49b30"} Apr 17 17:19:12.470154 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:12.469827 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-68c6f48597-xtmk2" event={"ID":"2b61e0cc-202f-4a26-bbd4-9f90c147c267","Type":"ContainerDied","Data":"4d45e6a980cf86b06b23232c2b0d995b2a59e2cd455286ca1bc3cc0ba911c6dc"} Apr 17 17:19:12.470154 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:12.469843 2572 scope.go:117] "RemoveContainer" containerID="fec7891a6ac7e1f1b26e574de41c33b6c1401423e83658b71c4aa513e1e49b30" Apr 17 17:19:12.479947 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:12.479927 2572 scope.go:117] "RemoveContainer" containerID="fec7891a6ac7e1f1b26e574de41c33b6c1401423e83658b71c4aa513e1e49b30" Apr 17 17:19:12.480276 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:19:12.480256 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fec7891a6ac7e1f1b26e574de41c33b6c1401423e83658b71c4aa513e1e49b30\": container with ID starting with fec7891a6ac7e1f1b26e574de41c33b6c1401423e83658b71c4aa513e1e49b30 not found: ID does not exist" containerID="fec7891a6ac7e1f1b26e574de41c33b6c1401423e83658b71c4aa513e1e49b30" Apr 17 17:19:12.480329 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:12.480286 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fec7891a6ac7e1f1b26e574de41c33b6c1401423e83658b71c4aa513e1e49b30"} err="failed to get container status \"fec7891a6ac7e1f1b26e574de41c33b6c1401423e83658b71c4aa513e1e49b30\": rpc error: 
code = NotFound desc = could not find container \"fec7891a6ac7e1f1b26e574de41c33b6c1401423e83658b71c4aa513e1e49b30\": container with ID starting with fec7891a6ac7e1f1b26e574de41c33b6c1401423e83658b71c4aa513e1e49b30 not found: ID does not exist" Apr 17 17:19:12.492485 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:12.492448 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-68c6f48597-xtmk2"] Apr 17 17:19:12.495796 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:12.495766 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-68c6f48597-xtmk2"] Apr 17 17:19:12.958854 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:12.958813 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b61e0cc-202f-4a26-bbd4-9f90c147c267" path="/var/lib/kubelet/pods/2b61e0cc-202f-4a26-bbd4-9f90c147c267/volumes" Apr 17 17:19:33.028372 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.028332 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l"] Apr 17 17:19:33.029018 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.028935 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b61e0cc-202f-4a26-bbd4-9f90c147c267" containerName="manager" Apr 17 17:19:33.029018 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.028959 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b61e0cc-202f-4a26-bbd4-9f90c147c267" containerName="manager" Apr 17 17:19:33.029142 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.029069 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b61e0cc-202f-4a26-bbd4-9f90c147c267" containerName="manager" Apr 17 17:19:33.034023 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.033996 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.037752 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.037714 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 17:19:33.037752 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.037719 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 17:19:33.037992 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.037780 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-gnhc5\"" Apr 17 17:19:33.037992 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.037845 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 17 17:19:33.045227 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.044111 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l"] Apr 17 17:19:33.150180 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.150142 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.150180 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.150185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: 
\"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.150425 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.150276 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.150425 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.150296 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.150425 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.150330 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.150425 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.150353 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh4pk\" (UniqueName: \"kubernetes.io/projected/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-kube-api-access-bh4pk\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.251017 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.250976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.251017 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.251021 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.251273 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.251060 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.251273 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.251082 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.251273 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.251114 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.251273 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.251144 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh4pk\" (UniqueName: \"kubernetes.io/projected/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-kube-api-access-bh4pk\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.251568 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.251513 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.251568 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.251538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.251748 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.251590 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.253494 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.253471 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.253931 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.253907 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.259036 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.259008 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh4pk\" (UniqueName: \"kubernetes.io/projected/3c2bdefa-a97d-4c1e-9531-ca2cb44b0798-kube-api-access-bh4pk\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l\" (UID: \"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.355460 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.355365 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:33.503784 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.503753 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l"] Apr 17 17:19:33.504901 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:19:33.504867 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c2bdefa_a97d_4c1e_9531_ca2cb44b0798.slice/crio-e81bdc0d6f10667fba99e152b700eba5221021fd5a830207ad50d0cf42b1bebf WatchSource:0}: Error finding container e81bdc0d6f10667fba99e152b700eba5221021fd5a830207ad50d0cf42b1bebf: Status 404 returned error can't find the container with id e81bdc0d6f10667fba99e152b700eba5221021fd5a830207ad50d0cf42b1bebf Apr 17 17:19:33.562590 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:33.562535 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" event={"ID":"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798","Type":"ContainerStarted","Data":"e81bdc0d6f10667fba99e152b700eba5221021fd5a830207ad50d0cf42b1bebf"} Apr 17 17:19:39.596354 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:39.596311 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" event={"ID":"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798","Type":"ContainerStarted","Data":"3fab800a1febe792efa118ead08d385469bd9d9f0719b003c1222cd67b6ffe3e"} Apr 17 17:19:48.640538 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:48.640501 2572 generic.go:358] "Generic (PLEG): container finished" podID="3c2bdefa-a97d-4c1e-9531-ca2cb44b0798" containerID="3fab800a1febe792efa118ead08d385469bd9d9f0719b003c1222cd67b6ffe3e" exitCode=0 Apr 17 17:19:48.640942 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:48.640577 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" event={"ID":"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798","Type":"ContainerDied","Data":"3fab800a1febe792efa118ead08d385469bd9d9f0719b003c1222cd67b6ffe3e"} Apr 17 17:19:50.653163 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:50.653118 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" event={"ID":"3c2bdefa-a97d-4c1e-9531-ca2cb44b0798","Type":"ContainerStarted","Data":"201ac7ac7afdf757d8dbb9011df63ee7039fd80774e35947bd0f4328a3b1573d"} Apr 17 17:19:50.653720 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:50.653367 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:19:50.673888 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:50.673826 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" podStartSLOduration=1.32250739 podStartE2EDuration="17.673808619s" podCreationTimestamp="2026-04-17 17:19:33 +0000 UTC" firstStartedPulling="2026-04-17 17:19:33.506625475 +0000 UTC m=+701.174309107" lastFinishedPulling="2026-04-17 17:19:49.857926429 +0000 UTC m=+717.525610336" observedRunningTime="2026-04-17 17:19:50.671086882 +0000 UTC m=+718.338770548" watchObservedRunningTime="2026-04-17 17:19:50.673808619 +0000 UTC m=+718.341492314" Apr 17 17:19:52.036729 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.036688 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx"] Apr 17 17:19:52.040603 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.040579 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.043126 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.043104 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 17 17:19:52.052965 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.052938 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx"] Apr 17 17:19:52.140111 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.140070 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/01bede85-f446-42d6-ad57-72ef778eaa18-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.140353 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.140122 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/01bede85-f446-42d6-ad57-72ef778eaa18-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.140353 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.140155 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/01bede85-f446-42d6-ad57-72ef778eaa18-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.140353 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.140200 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01bede85-f446-42d6-ad57-72ef778eaa18-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.140353 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.140299 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01bede85-f446-42d6-ad57-72ef778eaa18-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.140520 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.140366 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmbjj\" (UniqueName: \"kubernetes.io/projected/01bede85-f446-42d6-ad57-72ef778eaa18-kube-api-access-vmbjj\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.241539 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.241489 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/01bede85-f446-42d6-ad57-72ef778eaa18-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.241760 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.241550 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/01bede85-f446-42d6-ad57-72ef778eaa18-model-cache\") pod 
\"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.241760 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.241602 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/01bede85-f446-42d6-ad57-72ef778eaa18-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.241760 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.241638 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01bede85-f446-42d6-ad57-72ef778eaa18-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.241760 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.241668 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01bede85-f446-42d6-ad57-72ef778eaa18-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.241760 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.241725 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmbjj\" (UniqueName: \"kubernetes.io/projected/01bede85-f446-42d6-ad57-72ef778eaa18-kube-api-access-vmbjj\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.242033 ip-10-0-138-224 kubenswrapper[2572]: I0417 
17:19:52.242006 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/01bede85-f446-42d6-ad57-72ef778eaa18-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.242091 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.242024 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/01bede85-f446-42d6-ad57-72ef778eaa18-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.242150 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.242111 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01bede85-f446-42d6-ad57-72ef778eaa18-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.243983 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.243954 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/01bede85-f446-42d6-ad57-72ef778eaa18-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.244342 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.244320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01bede85-f446-42d6-ad57-72ef778eaa18-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: 
\"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.250038 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.250011 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmbjj\" (UniqueName: \"kubernetes.io/projected/01bede85-f446-42d6-ad57-72ef778eaa18-kube-api-access-vmbjj\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx\" (UID: \"01bede85-f446-42d6-ad57-72ef778eaa18\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.352531 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.352423 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:52.507903 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.507872 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx"] Apr 17 17:19:52.509189 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:19:52.509158 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bede85_f446_42d6_ad57_72ef778eaa18.slice/crio-ccd82ad779e121424b149e9c5735e739d217b29b5bf07a0a91c9eafc4614edc0 WatchSource:0}: Error finding container ccd82ad779e121424b149e9c5735e739d217b29b5bf07a0a91c9eafc4614edc0: Status 404 returned error can't find the container with id ccd82ad779e121424b149e9c5735e739d217b29b5bf07a0a91c9eafc4614edc0 Apr 17 17:19:52.663538 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.663428 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" event={"ID":"01bede85-f446-42d6-ad57-72ef778eaa18","Type":"ContainerStarted","Data":"3a3d1eb0b92f9100a7b881c6ff7ec7c2094d7d92ab3df08579c8e5dd62835f26"} Apr 17 17:19:52.663538 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:52.663476 2572 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" event={"ID":"01bede85-f446-42d6-ad57-72ef778eaa18","Type":"ContainerStarted","Data":"ccd82ad779e121424b149e9c5735e739d217b29b5bf07a0a91c9eafc4614edc0"} Apr 17 17:19:58.693917 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:58.693872 2572 generic.go:358] "Generic (PLEG): container finished" podID="01bede85-f446-42d6-ad57-72ef778eaa18" containerID="3a3d1eb0b92f9100a7b881c6ff7ec7c2094d7d92ab3df08579c8e5dd62835f26" exitCode=0 Apr 17 17:19:58.694434 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:58.693918 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" event={"ID":"01bede85-f446-42d6-ad57-72ef778eaa18","Type":"ContainerDied","Data":"3a3d1eb0b92f9100a7b881c6ff7ec7c2094d7d92ab3df08579c8e5dd62835f26"} Apr 17 17:19:59.700780 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:59.700726 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" event={"ID":"01bede85-f446-42d6-ad57-72ef778eaa18","Type":"ContainerStarted","Data":"953573cdc28209e55010546e37f0514cf9ead47171a462905accb37df8ec3dbb"} Apr 17 17:19:59.701701 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:59.700995 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:19:59.721588 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:19:59.721514 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" podStartSLOduration=7.348738015 podStartE2EDuration="7.721492341s" podCreationTimestamp="2026-04-17 17:19:52 +0000 UTC" firstStartedPulling="2026-04-17 17:19:58.694733017 +0000 UTC m=+726.362416649" lastFinishedPulling="2026-04-17 17:19:59.067487343 +0000 UTC m=+726.735170975" observedRunningTime="2026-04-17 17:19:59.718869237 +0000 UTC m=+727.386552891" 
watchObservedRunningTime="2026-04-17 17:19:59.721492341 +0000 UTC m=+727.389175996" Apr 17 17:20:01.670642 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:20:01.670612 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l" Apr 17 17:20:10.719534 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:20:10.719494 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx" Apr 17 17:21:47.992228 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:47.991432 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-84979b9f54-f7cb9"] Apr 17 17:21:47.992228 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:47.991830 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-84979b9f54-f7cb9" podUID="184b5230-2f56-4173-85d8-c3ad2269f4b0" containerName="manager" containerID="cri-o://fb9b0447b64ec75350b01506dbf2f46dd7bb29f2a60313369e2a13e306952af8" gracePeriod=10 Apr 17 17:21:48.168353 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:48.168309 2572 generic.go:358] "Generic (PLEG): container finished" podID="184b5230-2f56-4173-85d8-c3ad2269f4b0" containerID="fb9b0447b64ec75350b01506dbf2f46dd7bb29f2a60313369e2a13e306952af8" exitCode=0 Apr 17 17:21:48.168498 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:48.168364 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84979b9f54-f7cb9" event={"ID":"184b5230-2f56-4173-85d8-c3ad2269f4b0","Type":"ContainerDied","Data":"fb9b0447b64ec75350b01506dbf2f46dd7bb29f2a60313369e2a13e306952af8"} Apr 17 17:21:48.257838 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:48.257759 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-84979b9f54-f7cb9" Apr 17 17:21:48.344787 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:48.344751 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-747kl\" (UniqueName: \"kubernetes.io/projected/184b5230-2f56-4173-85d8-c3ad2269f4b0-kube-api-access-747kl\") pod \"184b5230-2f56-4173-85d8-c3ad2269f4b0\" (UID: \"184b5230-2f56-4173-85d8-c3ad2269f4b0\") " Apr 17 17:21:48.346994 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:48.346956 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/184b5230-2f56-4173-85d8-c3ad2269f4b0-kube-api-access-747kl" (OuterVolumeSpecName: "kube-api-access-747kl") pod "184b5230-2f56-4173-85d8-c3ad2269f4b0" (UID: "184b5230-2f56-4173-85d8-c3ad2269f4b0"). InnerVolumeSpecName "kube-api-access-747kl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:21:48.446319 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:48.446275 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-747kl\" (UniqueName: \"kubernetes.io/projected/184b5230-2f56-4173-85d8-c3ad2269f4b0-kube-api-access-747kl\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:21:49.174526 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:49.174497 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-84979b9f54-f7cb9" Apr 17 17:21:49.174955 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:49.174496 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84979b9f54-f7cb9" event={"ID":"184b5230-2f56-4173-85d8-c3ad2269f4b0","Type":"ContainerDied","Data":"61700462fb11b0cecc80314ccf35fb31d61a5a318f6ef4c41b912c82ae4907af"} Apr 17 17:21:49.174955 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:49.174612 2572 scope.go:117] "RemoveContainer" containerID="fb9b0447b64ec75350b01506dbf2f46dd7bb29f2a60313369e2a13e306952af8" Apr 17 17:21:49.193617 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:49.193583 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-84979b9f54-f7cb9"] Apr 17 17:21:49.198830 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:49.198797 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-84979b9f54-f7cb9"] Apr 17 17:21:49.883703 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:49.883663 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-84979b9f54-dcn74"] Apr 17 17:21:49.884073 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:49.884060 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="184b5230-2f56-4173-85d8-c3ad2269f4b0" containerName="manager" Apr 17 17:21:49.884121 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:49.884075 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="184b5230-2f56-4173-85d8-c3ad2269f4b0" containerName="manager" Apr 17 17:21:49.884163 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:49.884153 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="184b5230-2f56-4173-85d8-c3ad2269f4b0" containerName="manager" Apr 17 17:21:49.888713 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:49.888685 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-84979b9f54-dcn74" Apr 17 17:21:49.891074 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:49.891046 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-wkvz6\"" Apr 17 17:21:49.899818 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:49.899784 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-84979b9f54-dcn74"] Apr 17 17:21:50.062599 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:50.062557 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jcsk\" (UniqueName: \"kubernetes.io/projected/c138494f-f8d0-407a-8b45-48872c23d5c8-kube-api-access-4jcsk\") pod \"maas-controller-84979b9f54-dcn74\" (UID: \"c138494f-f8d0-407a-8b45-48872c23d5c8\") " pod="opendatahub/maas-controller-84979b9f54-dcn74" Apr 17 17:21:50.163646 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:50.163541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jcsk\" (UniqueName: \"kubernetes.io/projected/c138494f-f8d0-407a-8b45-48872c23d5c8-kube-api-access-4jcsk\") pod \"maas-controller-84979b9f54-dcn74\" (UID: \"c138494f-f8d0-407a-8b45-48872c23d5c8\") " pod="opendatahub/maas-controller-84979b9f54-dcn74" Apr 17 17:21:50.172507 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:50.172485 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jcsk\" (UniqueName: \"kubernetes.io/projected/c138494f-f8d0-407a-8b45-48872c23d5c8-kube-api-access-4jcsk\") pod \"maas-controller-84979b9f54-dcn74\" (UID: \"c138494f-f8d0-407a-8b45-48872c23d5c8\") " pod="opendatahub/maas-controller-84979b9f54-dcn74" Apr 17 17:21:50.201051 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:50.201009 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-84979b9f54-dcn74" Apr 17 17:21:50.338007 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:50.337977 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-84979b9f54-dcn74"] Apr 17 17:21:50.339725 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:21:50.339687 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc138494f_f8d0_407a_8b45_48872c23d5c8.slice/crio-f690f70d1b0c6e811b17e2e54066ad837fa7e493bfde3fb9046b668ac171d3d5 WatchSource:0}: Error finding container f690f70d1b0c6e811b17e2e54066ad837fa7e493bfde3fb9046b668ac171d3d5: Status 404 returned error can't find the container with id f690f70d1b0c6e811b17e2e54066ad837fa7e493bfde3fb9046b668ac171d3d5 Apr 17 17:21:50.950635 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:50.950599 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="184b5230-2f56-4173-85d8-c3ad2269f4b0" path="/var/lib/kubelet/pods/184b5230-2f56-4173-85d8-c3ad2269f4b0/volumes" Apr 17 17:21:51.185823 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:51.185774 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84979b9f54-dcn74" event={"ID":"c138494f-f8d0-407a-8b45-48872c23d5c8","Type":"ContainerStarted","Data":"6c76d2e8daafa993479ae4d8ba5f311c977307cbede164698f7e7182446e80c6"} Apr 17 17:21:51.185823 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:51.185824 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84979b9f54-dcn74" event={"ID":"c138494f-f8d0-407a-8b45-48872c23d5c8","Type":"ContainerStarted","Data":"f690f70d1b0c6e811b17e2e54066ad837fa7e493bfde3fb9046b668ac171d3d5"} Apr 17 17:21:51.186053 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:51.185853 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-84979b9f54-dcn74" Apr 17 
17:21:51.203357 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:21:51.203237 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-84979b9f54-dcn74" podStartSLOduration=1.545936185 podStartE2EDuration="2.203199276s" podCreationTimestamp="2026-04-17 17:21:49 +0000 UTC" firstStartedPulling="2026-04-17 17:21:50.341423663 +0000 UTC m=+838.009107296" lastFinishedPulling="2026-04-17 17:21:50.998686752 +0000 UTC m=+838.666370387" observedRunningTime="2026-04-17 17:21:51.200923572 +0000 UTC m=+838.868607229" watchObservedRunningTime="2026-04-17 17:21:51.203199276 +0000 UTC m=+838.870882992" Apr 17 17:22:02.199894 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:22:02.199858 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-84979b9f54-dcn74" Apr 17 17:22:52.900989 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:22:52.900956 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:22:52.901897 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:22:52.901876 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:27:52.933199 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:27:52.933174 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:27:52.936759 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:27:52.936733 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:31:58.834359 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:58.834321 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz"] Apr 17 17:31:58.834885 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:58.834550 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" podUID="640d5373-8271-46d1-9815-0561ce942e3b" containerName="manager" containerID="cri-o://52a5091ff9a94f8bba3a077bb1f6b00ab690bdeb7b59e899f107b5f7be9ee006" gracePeriod=10 Apr 17 17:31:59.388337 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.388314 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" Apr 17 17:31:59.527259 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.527176 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx5mn\" (UniqueName: \"kubernetes.io/projected/640d5373-8271-46d1-9815-0561ce942e3b-kube-api-access-hx5mn\") pod \"640d5373-8271-46d1-9815-0561ce942e3b\" (UID: \"640d5373-8271-46d1-9815-0561ce942e3b\") " Apr 17 17:31:59.527259 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.527240 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/640d5373-8271-46d1-9815-0561ce942e3b-extensions-socket-volume\") pod \"640d5373-8271-46d1-9815-0561ce942e3b\" (UID: \"640d5373-8271-46d1-9815-0561ce942e3b\") " Apr 17 17:31:59.527608 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.527586 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/640d5373-8271-46d1-9815-0561ce942e3b-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "640d5373-8271-46d1-9815-0561ce942e3b" (UID: "640d5373-8271-46d1-9815-0561ce942e3b"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:31:59.529269 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.529246 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/640d5373-8271-46d1-9815-0561ce942e3b-kube-api-access-hx5mn" (OuterVolumeSpecName: "kube-api-access-hx5mn") pod "640d5373-8271-46d1-9815-0561ce942e3b" (UID: "640d5373-8271-46d1-9815-0561ce942e3b"). InnerVolumeSpecName "kube-api-access-hx5mn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:31:59.628493 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.628470 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hx5mn\" (UniqueName: \"kubernetes.io/projected/640d5373-8271-46d1-9815-0561ce942e3b-kube-api-access-hx5mn\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:31:59.628493 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.628492 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/640d5373-8271-46d1-9815-0561ce942e3b-extensions-socket-volume\") on node \"ip-10-0-138-224.ec2.internal\" DevicePath \"\"" Apr 17 17:31:59.652818 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.652788 2572 generic.go:358] "Generic (PLEG): container finished" podID="640d5373-8271-46d1-9815-0561ce942e3b" containerID="52a5091ff9a94f8bba3a077bb1f6b00ab690bdeb7b59e899f107b5f7be9ee006" exitCode=0 Apr 17 17:31:59.652927 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.652848 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" Apr 17 17:31:59.652927 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.652876 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" event={"ID":"640d5373-8271-46d1-9815-0561ce942e3b","Type":"ContainerDied","Data":"52a5091ff9a94f8bba3a077bb1f6b00ab690bdeb7b59e899f107b5f7be9ee006"} Apr 17 17:31:59.652927 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.652921 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz" event={"ID":"640d5373-8271-46d1-9815-0561ce942e3b","Type":"ContainerDied","Data":"a7a0ebd04b97960eae3e6284e5ed3ccdae3607fb466d7af547ba8e754c58db1f"} Apr 17 17:31:59.653079 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.652942 2572 scope.go:117] "RemoveContainer" containerID="52a5091ff9a94f8bba3a077bb1f6b00ab690bdeb7b59e899f107b5f7be9ee006" Apr 17 17:31:59.667166 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.667146 2572 scope.go:117] "RemoveContainer" containerID="52a5091ff9a94f8bba3a077bb1f6b00ab690bdeb7b59e899f107b5f7be9ee006" Apr 17 17:31:59.667456 ip-10-0-138-224 kubenswrapper[2572]: E0417 17:31:59.667435 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52a5091ff9a94f8bba3a077bb1f6b00ab690bdeb7b59e899f107b5f7be9ee006\": container with ID starting with 52a5091ff9a94f8bba3a077bb1f6b00ab690bdeb7b59e899f107b5f7be9ee006 not found: ID does not exist" containerID="52a5091ff9a94f8bba3a077bb1f6b00ab690bdeb7b59e899f107b5f7be9ee006" Apr 17 17:31:59.667519 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.667465 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52a5091ff9a94f8bba3a077bb1f6b00ab690bdeb7b59e899f107b5f7be9ee006"} err="failed to get container status 
\"52a5091ff9a94f8bba3a077bb1f6b00ab690bdeb7b59e899f107b5f7be9ee006\": rpc error: code = NotFound desc = could not find container \"52a5091ff9a94f8bba3a077bb1f6b00ab690bdeb7b59e899f107b5f7be9ee006\": container with ID starting with 52a5091ff9a94f8bba3a077bb1f6b00ab690bdeb7b59e899f107b5f7be9ee006 not found: ID does not exist" Apr 17 17:31:59.681062 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.681037 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz"] Apr 17 17:31:59.683422 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:31:59.683402 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-5mjzz"] Apr 17 17:32:00.950702 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:32:00.950659 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="640d5373-8271-46d1-9815-0561ce942e3b" path="/var/lib/kubelet/pods/640d5373-8271-46d1-9815-0561ce942e3b/volumes" Apr 17 17:32:52.965750 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:32:52.965720 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:32:52.977886 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:32:52.977859 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:33:04.910547 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:04.910509 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7"] Apr 17 17:33:04.910975 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:04.910959 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="640d5373-8271-46d1-9815-0561ce942e3b" containerName="manager" Apr 17 17:33:04.911027 
ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:04.910977 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="640d5373-8271-46d1-9815-0561ce942e3b" containerName="manager" Apr 17 17:33:04.911062 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:04.911039 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="640d5373-8271-46d1-9815-0561ce942e3b" containerName="manager" Apr 17 17:33:04.914336 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:04.914315 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7" Apr 17 17:33:04.916769 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:04.916751 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-7gv2w\"" Apr 17 17:33:04.929093 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:04.929070 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7"] Apr 17 17:33:05.039525 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:05.039495 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f370589-98a2-4131-bbf8-5dff6b31af90-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-gczp7\" (UID: \"3f370589-98a2-4131-bbf8-5dff6b31af90\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7" Apr 17 17:33:05.039651 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:05.039534 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkxvn\" (UniqueName: \"kubernetes.io/projected/3f370589-98a2-4131-bbf8-5dff6b31af90-kube-api-access-fkxvn\") pod \"kuadrant-operator-controller-manager-55c7f4c975-gczp7\" (UID: 
\"3f370589-98a2-4131-bbf8-5dff6b31af90\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7" Apr 17 17:33:05.140168 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:05.140140 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkxvn\" (UniqueName: \"kubernetes.io/projected/3f370589-98a2-4131-bbf8-5dff6b31af90-kube-api-access-fkxvn\") pod \"kuadrant-operator-controller-manager-55c7f4c975-gczp7\" (UID: \"3f370589-98a2-4131-bbf8-5dff6b31af90\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7" Apr 17 17:33:05.140301 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:05.140252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f370589-98a2-4131-bbf8-5dff6b31af90-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-gczp7\" (UID: \"3f370589-98a2-4131-bbf8-5dff6b31af90\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7" Apr 17 17:33:05.140619 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:05.140602 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f370589-98a2-4131-bbf8-5dff6b31af90-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-gczp7\" (UID: \"3f370589-98a2-4131-bbf8-5dff6b31af90\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7" Apr 17 17:33:05.148720 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:05.148689 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkxvn\" (UniqueName: \"kubernetes.io/projected/3f370589-98a2-4131-bbf8-5dff6b31af90-kube-api-access-fkxvn\") pod \"kuadrant-operator-controller-manager-55c7f4c975-gczp7\" (UID: \"3f370589-98a2-4131-bbf8-5dff6b31af90\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7" Apr 17 17:33:05.224481 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:05.224450 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7" Apr 17 17:33:05.357720 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:05.357692 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7"] Apr 17 17:33:05.359668 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:33:05.359639 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f370589_98a2_4131_bbf8_5dff6b31af90.slice/crio-749d482a3acbfe7bdd8aeb52ff39e50cddc71c03fed8f50b9a0b58c55bed6c4b WatchSource:0}: Error finding container 749d482a3acbfe7bdd8aeb52ff39e50cddc71c03fed8f50b9a0b58c55bed6c4b: Status 404 returned error can't find the container with id 749d482a3acbfe7bdd8aeb52ff39e50cddc71c03fed8f50b9a0b58c55bed6c4b Apr 17 17:33:05.362164 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:05.362143 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:33:05.921441 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:05.921404 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7" event={"ID":"3f370589-98a2-4131-bbf8-5dff6b31af90","Type":"ContainerStarted","Data":"4863016176e2e33a0c826b67f72edb9292230e18358ef54cdcd8f80066eff01c"} Apr 17 17:33:05.921441 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:05.921439 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7" event={"ID":"3f370589-98a2-4131-bbf8-5dff6b31af90","Type":"ContainerStarted","Data":"749d482a3acbfe7bdd8aeb52ff39e50cddc71c03fed8f50b9a0b58c55bed6c4b"} Apr 
17 17:33:05.921943 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:05.921482 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7" Apr 17 17:33:05.946809 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:05.946763 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7" podStartSLOduration=1.9467486840000001 podStartE2EDuration="1.946748684s" podCreationTimestamp="2026-04-17 17:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:33:05.94336864 +0000 UTC m=+1513.611052287" watchObservedRunningTime="2026-04-17 17:33:05.946748684 +0000 UTC m=+1513.614432337" Apr 17 17:33:16.931370 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:33:16.931339 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gczp7" Apr 17 17:37:53.005165 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:37:53.005136 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:37:53.011786 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:37:53.011763 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:42:49.036853 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:49.036820 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-zgh7t_a52b22a7-ef79-4ea3-9766-bb80d6394b58/manager/0.log" Apr 17 17:42:49.157424 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:49.157393 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_maas-api-77cd854497-p8kqx_459cb84c-3e80-4510-a443-f0518e1a38e8/maas-api/0.log" Apr 17 17:42:49.278329 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:49.278302 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-84979b9f54-dcn74_c138494f-f8d0-407a-8b45-48872c23d5c8/manager/0.log" Apr 17 17:42:49.395878 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:49.395799 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-tqt9s_543f91a3-6c9c-4a1b-837f-670ccec0ff35/manager/1.log" Apr 17 17:42:49.517366 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:49.517323 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6569445fb5-7qtnw_af5524b1-ba23-4941-96f9-faddbb864aa7/manager/0.log" Apr 17 17:42:50.603472 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:50.603440 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm_7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c/util/0.log" Apr 17 17:42:50.610639 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:50.610608 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm_7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c/pull/0.log" Apr 17 17:42:50.616929 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:50.616911 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm_7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c/extract/0.log" Apr 17 17:42:50.740009 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:50.739981 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5_ea3ddf39-29a8-4769-a4af-984d468f356c/pull/0.log" Apr 17 17:42:50.746581 
ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:50.746557 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5_ea3ddf39-29a8-4769-a4af-984d468f356c/extract/0.log" Apr 17 17:42:50.752706 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:50.752690 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5_ea3ddf39-29a8-4769-a4af-984d468f356c/util/0.log" Apr 17 17:42:50.853717 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:50.853668 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk_2c845075-4b05-457f-a97f-3ca76399588e/util/0.log" Apr 17 17:42:50.859547 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:50.859529 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk_2c845075-4b05-457f-a97f-3ca76399588e/pull/0.log" Apr 17 17:42:50.865179 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:50.865156 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk_2c845075-4b05-457f-a97f-3ca76399588e/extract/0.log" Apr 17 17:42:50.974922 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:50.974893 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb_8f2fdaaa-222d-4cf4-aff4-3c58136512b3/extract/0.log" Apr 17 17:42:50.981983 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:50.981965 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb_8f2fdaaa-222d-4cf4-aff4-3c58136512b3/util/0.log" Apr 17 17:42:50.988873 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:50.988853 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb_8f2fdaaa-222d-4cf4-aff4-3c58136512b3/pull/0.log" Apr 17 17:42:51.327581 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:51.327555 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-btkxs_f5ed6e24-4094-4171-8cf1-63db13dfd3eb/manager/0.log" Apr 17 17:42:51.675371 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:51.675344 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-gczp7_3f370589-98a2-4131-bbf8-5dff6b31af90/manager/0.log" Apr 17 17:42:51.908913 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:51.908878 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-nqghx_ae6e8b6f-740c-4964-b8b7-e9cc7e48c340/manager/0.log" Apr 17 17:42:52.367696 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:52.367655 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-gsx6s_e161db9b-4d23-41e2-9c33-155fcf18d401/discovery/0.log" Apr 17 17:42:52.578978 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:52.578952 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-db5457dbf-xgnb5_7190378e-cdc3-4581-b7a0-d93b9cd31af8/kube-auth-proxy/0.log" Apr 17 17:42:52.813724 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:52.813687 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7b86f88f8-qvj8p_394bfa3f-360e-4dbb-ab25-846a90b23983/router/0.log" Apr 17 17:42:53.036276 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:53.036250 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" 
Apr 17 17:42:53.044421 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:53.044397 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:42:53.153744 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:53.153676 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx_01bede85-f446-42d6-ad57-72ef778eaa18/main/0.log" Apr 17 17:42:53.161596 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:53.161569 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-fd4qx_01bede85-f446-42d6-ad57-72ef778eaa18/storage-initializer/0.log" Apr 17 17:42:53.763942 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:53.763912 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l_3c2bdefa-a97d-4c1e-9531-ca2cb44b0798/storage-initializer/0.log" Apr 17 17:42:53.772024 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:42:53.771983 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-5kn6l_3c2bdefa-a97d-4c1e-9531-ca2cb44b0798/main/0.log" Apr 17 17:43:00.109875 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:00.109835 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-zw6zj_f15ce690-dc22-41c5-a9f0-106b87ea9815/global-pull-secret-syncer/0.log" Apr 17 17:43:00.216281 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:00.216254 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vf59c_c39d4ed7-fddd-4bc2-8cfb-8d0155da370b/konnectivity-agent/0.log" Apr 17 17:43:00.278737 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:00.278712 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-224.ec2.internal_03b99da47aa26d733084fbd12fd690dc/haproxy/0.log" Apr 17 17:43:03.815293 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:03.815194 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm_7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c/extract/0.log" Apr 17 17:43:03.842486 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:03.842463 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm_7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c/util/0.log" Apr 17 17:43:03.862770 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:03.862724 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759vdzbm_7a3f5eb4-82f0-4768-a1c6-a2ca68225d7c/pull/0.log" Apr 17 17:43:03.886634 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:03.886614 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5_ea3ddf39-29a8-4769-a4af-984d468f356c/extract/0.log" Apr 17 17:43:03.905569 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:03.905546 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5_ea3ddf39-29a8-4769-a4af-984d468f356c/util/0.log" Apr 17 17:43:03.927472 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:03.927444 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qk9g5_ea3ddf39-29a8-4769-a4af-984d468f356c/pull/0.log" Apr 17 17:43:03.957798 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:03.957780 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk_2c845075-4b05-457f-a97f-3ca76399588e/extract/0.log" Apr 17 17:43:03.978038 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:03.978021 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk_2c845075-4b05-457f-a97f-3ca76399588e/util/0.log" Apr 17 17:43:03.999385 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:03.999363 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdqdk_2c845075-4b05-457f-a97f-3ca76399588e/pull/0.log" Apr 17 17:43:04.024302 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:04.024279 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb_8f2fdaaa-222d-4cf4-aff4-3c58136512b3/extract/0.log" Apr 17 17:43:04.047825 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:04.047806 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb_8f2fdaaa-222d-4cf4-aff4-3c58136512b3/util/0.log" Apr 17 17:43:04.067619 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:04.067562 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ft8cb_8f2fdaaa-222d-4cf4-aff4-3c58136512b3/pull/0.log" Apr 17 17:43:04.471723 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:04.471700 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-btkxs_f5ed6e24-4094-4171-8cf1-63db13dfd3eb/manager/0.log" Apr 17 17:43:04.645151 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:04.645123 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-gczp7_3f370589-98a2-4131-bbf8-5dff6b31af90/manager/0.log" Apr 17 17:43:04.756891 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:04.756815 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-nqghx_ae6e8b6f-740c-4964-b8b7-e9cc7e48c340/manager/0.log" Apr 17 17:43:06.393682 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:06.393654 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-88j49_d7991c94-ae1a-4579-9bde-b10b5d113e64/cluster-monitoring-operator/0.log" Apr 17 17:43:06.621735 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:06.621709 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jxtms_e4762e69-9a43-41b3-9bb8-2a302e94867a/node-exporter/0.log" Apr 17 17:43:06.639388 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:06.639361 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jxtms_e4762e69-9a43-41b3-9bb8-2a302e94867a/kube-rbac-proxy/0.log" Apr 17 17:43:06.658003 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:06.657940 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jxtms_e4762e69-9a43-41b3-9bb8-2a302e94867a/init-textfile/0.log" Apr 17 17:43:07.004085 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:07.004057 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-28tp8_c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5/prometheus-operator/0.log" Apr 17 17:43:07.026620 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:07.026597 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-28tp8_c548aaa0-7fdb-46f1-8ebb-9f5a8edd1ee5/kube-rbac-proxy/0.log" Apr 17 17:43:07.053354 
ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:07.053331 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-wsbz9_cd8e3ca4-bd8f-42be-8464-9caa8f36f300/prometheus-operator-admission-webhook/0.log" Apr 17 17:43:08.198367 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.198340 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg"] Apr 17 17:43:08.202512 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.202487 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.205328 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.204989 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2ssms\"/\"kube-root-ca.crt\"" Apr 17 17:43:08.205328 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.205037 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2ssms\"/\"openshift-service-ca.crt\"" Apr 17 17:43:08.205665 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.205643 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2ssms\"/\"default-dockercfg-l4kdb\"" Apr 17 17:43:08.207981 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.207961 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg"] Apr 17 17:43:08.256398 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.256376 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-qm5w4_5b54101b-e6bf-474e-95bf-7a3894ef0486/networking-console-plugin/0.log" Apr 17 17:43:08.280754 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.280731 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/85e365fa-9ef9-4930-a70a-cf1ad782802a-proc\") pod \"perf-node-gather-daemonset-p52gg\" (UID: \"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.280857 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.280764 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/85e365fa-9ef9-4930-a70a-cf1ad782802a-lib-modules\") pod \"perf-node-gather-daemonset-p52gg\" (UID: \"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.280857 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.280780 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/85e365fa-9ef9-4930-a70a-cf1ad782802a-podres\") pod \"perf-node-gather-daemonset-p52gg\" (UID: \"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.280857 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.280797 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7m72\" (UniqueName: \"kubernetes.io/projected/85e365fa-9ef9-4930-a70a-cf1ad782802a-kube-api-access-p7m72\") pod \"perf-node-gather-daemonset-p52gg\" (UID: \"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.281018 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.280920 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/85e365fa-9ef9-4930-a70a-cf1ad782802a-sys\") pod \"perf-node-gather-daemonset-p52gg\" (UID: 
\"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.382249 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.382200 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/85e365fa-9ef9-4930-a70a-cf1ad782802a-sys\") pod \"perf-node-gather-daemonset-p52gg\" (UID: \"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.382351 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.382276 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/85e365fa-9ef9-4930-a70a-cf1ad782802a-proc\") pod \"perf-node-gather-daemonset-p52gg\" (UID: \"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.382351 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.382287 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/85e365fa-9ef9-4930-a70a-cf1ad782802a-sys\") pod \"perf-node-gather-daemonset-p52gg\" (UID: \"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.382351 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.382302 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/85e365fa-9ef9-4930-a70a-cf1ad782802a-lib-modules\") pod \"perf-node-gather-daemonset-p52gg\" (UID: \"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.382351 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.382317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/85e365fa-9ef9-4930-a70a-cf1ad782802a-podres\") pod \"perf-node-gather-daemonset-p52gg\" (UID: \"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.382351 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.382334 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7m72\" (UniqueName: \"kubernetes.io/projected/85e365fa-9ef9-4930-a70a-cf1ad782802a-kube-api-access-p7m72\") pod \"perf-node-gather-daemonset-p52gg\" (UID: \"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.382554 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.382350 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/85e365fa-9ef9-4930-a70a-cf1ad782802a-proc\") pod \"perf-node-gather-daemonset-p52gg\" (UID: \"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.382554 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.382458 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/85e365fa-9ef9-4930-a70a-cf1ad782802a-podres\") pod \"perf-node-gather-daemonset-p52gg\" (UID: \"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.382554 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.382475 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/85e365fa-9ef9-4930-a70a-cf1ad782802a-lib-modules\") pod \"perf-node-gather-daemonset-p52gg\" (UID: \"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.390595 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.390575 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7m72\" (UniqueName: \"kubernetes.io/projected/85e365fa-9ef9-4930-a70a-cf1ad782802a-kube-api-access-p7m72\") pod \"perf-node-gather-daemonset-p52gg\" (UID: \"85e365fa-9ef9-4930-a70a-cf1ad782802a\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.514105 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.514037 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:08.639454 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.639427 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg"] Apr 17 17:43:08.640916 ip-10-0-138-224 kubenswrapper[2572]: W0417 17:43:08.640888 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod85e365fa_9ef9_4930_a70a_cf1ad782802a.slice/crio-4240401597b94f7f6074aa049cb29045b3701f16b07002f3e88e421f845ac325 WatchSource:0}: Error finding container 4240401597b94f7f6074aa049cb29045b3701f16b07002f3e88e421f845ac325: Status 404 returned error can't find the container with id 4240401597b94f7f6074aa049cb29045b3701f16b07002f3e88e421f845ac325 Apr 17 17:43:08.642929 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:08.642914 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:43:09.257490 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:09.257462 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85c88ddcb8-f5pfv_1551c43c-26fe-414a-a5cc-4073a15ede45/console/0.log" Apr 17 17:43:09.406986 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:09.406951 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" 
event={"ID":"85e365fa-9ef9-4930-a70a-cf1ad782802a","Type":"ContainerStarted","Data":"bd6c02dab774dda5d15297b0f934d84af1313ff823f4497f34c342bd1060108a"} Apr 17 17:43:09.406986 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:09.406987 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" event={"ID":"85e365fa-9ef9-4930-a70a-cf1ad782802a","Type":"ContainerStarted","Data":"4240401597b94f7f6074aa049cb29045b3701f16b07002f3e88e421f845ac325"} Apr 17 17:43:09.407191 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:09.407071 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:09.424552 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:09.424512 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" podStartSLOduration=1.424500079 podStartE2EDuration="1.424500079s" podCreationTimestamp="2026-04-17 17:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:43:09.421802556 +0000 UTC m=+2117.089486211" watchObservedRunningTime="2026-04-17 17:43:09.424500079 +0000 UTC m=+2117.092183733" Apr 17 17:43:09.745097 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:09.745067 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-7gpcz_507cdf5e-4dc4-4544-aa29-50b8e78da951/volume-data-source-validator/0.log" Apr 17 17:43:10.501942 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:10.501898 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-77gh9_05633129-a6c5-4b2a-9ddc-4a376e6b79c3/dns/0.log" Apr 17 17:43:10.520875 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:10.520836 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-77gh9_05633129-a6c5-4b2a-9ddc-4a376e6b79c3/kube-rbac-proxy/0.log" Apr 17 17:43:10.643707 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:10.643680 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rxccf_c3fbb6b8-715e-4512-b7ce-584ff3fdf72e/dns-node-resolver/0.log" Apr 17 17:43:11.085492 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:11.085463 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-555cf688b8-smdbn_9cad22a0-bc57-4643-ab1b-ce70d19c1c47/registry/0.log" Apr 17 17:43:11.156708 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:11.156680 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vbgg6_e2ff083b-5e25-4ad5-9ebe-7d015658c212/node-ca/0.log" Apr 17 17:43:12.113919 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:12.113890 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-gsx6s_e161db9b-4d23-41e2-9c33-155fcf18d401/discovery/0.log" Apr 17 17:43:12.178533 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:12.178511 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-db5457dbf-xgnb5_7190378e-cdc3-4581-b7a0-d93b9cd31af8/kube-auth-proxy/0.log" Apr 17 17:43:12.303537 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:12.303507 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7b86f88f8-qvj8p_394bfa3f-360e-4dbb-ab25-846a90b23983/router/0.log" Apr 17 17:43:12.807261 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:12.807227 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tjgfv_8cd72e65-31b5-4a4a-acf8-3800bb1d5898/serve-healthcheck-canary/0.log" Apr 17 17:43:13.376831 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:13.376783 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-tghn4_81b4269c-846c-46c3-9c36-0aa0083de609/kube-rbac-proxy/0.log" Apr 17 17:43:13.396123 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:13.396092 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tghn4_81b4269c-846c-46c3-9c36-0aa0083de609/exporter/0.log" Apr 17 17:43:13.416950 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:13.416926 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tghn4_81b4269c-846c-46c3-9c36-0aa0083de609/extractor/0.log" Apr 17 17:43:15.315969 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:15.315934 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-zgh7t_a52b22a7-ef79-4ea3-9766-bb80d6394b58/manager/0.log" Apr 17 17:43:15.359092 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:15.359066 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-77cd854497-p8kqx_459cb84c-3e80-4510-a443-f0518e1a38e8/maas-api/0.log" Apr 17 17:43:15.422907 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:15.422883 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-p52gg" Apr 17 17:43:15.439600 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:15.439575 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-84979b9f54-dcn74_c138494f-f8d0-407a-8b45-48872c23d5c8/manager/0.log" Apr 17 17:43:15.459616 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:15.459582 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-tqt9s_543f91a3-6c9c-4a1b-837f-670ccec0ff35/manager/0.log" Apr 17 17:43:15.481331 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:15.481309 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-tqt9s_543f91a3-6c9c-4a1b-837f-670ccec0ff35/manager/1.log" Apr 17 17:43:15.502981 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:15.502956 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6569445fb5-7qtnw_af5524b1-ba23-4941-96f9-faddbb864aa7/manager/0.log" Apr 17 17:43:21.385905 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:21.385835 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-4wfsm_03d537b0-8e43-49ac-aaf8-dc6d4576a650/kube-storage-version-migrator-operator/1.log" Apr 17 17:43:21.388521 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:21.388471 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-4wfsm_03d537b0-8e43-49ac-aaf8-dc6d4576a650/kube-storage-version-migrator-operator/0.log" Apr 17 17:43:22.685354 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:22.685326 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kn4sv_e3e9c005-8254-4300-8c36-63018e536c0f/kube-multus-additional-cni-plugins/0.log" Apr 17 17:43:22.713993 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:22.713964 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kn4sv_e3e9c005-8254-4300-8c36-63018e536c0f/egress-router-binary-copy/0.log" Apr 17 17:43:22.737987 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:22.737963 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kn4sv_e3e9c005-8254-4300-8c36-63018e536c0f/cni-plugins/0.log" Apr 17 17:43:22.758043 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:22.758022 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kn4sv_e3e9c005-8254-4300-8c36-63018e536c0f/bond-cni-plugin/0.log" Apr 17 17:43:22.777298 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:22.777280 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kn4sv_e3e9c005-8254-4300-8c36-63018e536c0f/routeoverride-cni/0.log" Apr 17 17:43:22.795991 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:22.795969 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kn4sv_e3e9c005-8254-4300-8c36-63018e536c0f/whereabouts-cni-bincopy/0.log" Apr 17 17:43:22.815413 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:22.815392 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kn4sv_e3e9c005-8254-4300-8c36-63018e536c0f/whereabouts-cni/0.log" Apr 17 17:43:22.881081 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:22.881055 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cjws9_3feca2fd-76f4-4d80-9641-209f7a166211/kube-multus/0.log" Apr 17 17:43:23.023068 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:23.022991 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h6fpt_072c5e3f-6547-42c7-8e8e-c517d7283183/network-metrics-daemon/0.log" Apr 17 17:43:23.040253 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:23.040229 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h6fpt_072c5e3f-6547-42c7-8e8e-c517d7283183/kube-rbac-proxy/0.log" Apr 17 17:43:23.878374 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:23.878282 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-controller/0.log" Apr 17 17:43:23.895872 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:23.895847 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/0.log" Apr 17 17:43:23.914861 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:23.914834 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovn-acl-logging/1.log" Apr 17 17:43:23.939313 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:23.939287 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/kube-rbac-proxy-node/0.log" Apr 17 17:43:23.959604 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:23.959581 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 17:43:23.978505 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:23.978488 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/northd/0.log" Apr 17 17:43:23.997951 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:23.997922 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/nbdb/0.log" Apr 17 17:43:24.021995 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:24.021970 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/sbdb/0.log" Apr 17 17:43:24.185477 ip-10-0-138-224 kubenswrapper[2572]: I0417 17:43:24.185452 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ltqc_990242e6-25ef-4749-8d89-b0083d90c418/ovnkube-controller/0.log"