Apr 24 23:51:15.167565 ip-10-0-132-64 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 23:51:15.167578 ip-10-0-132-64 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 23:51:15.167587 ip-10-0-132-64 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 23:51:15.167889 ip-10-0-132-64 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 23:51:25.266133 ip-10-0-132-64 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 23:51:25.266151 ip-10-0-132-64 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 4ba585686308492e8fec0d0fea0744d8 --
Apr 24 23:53:58.723980 ip-10-0-132-64 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 23:53:59.155506 ip-10-0-132-64 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:59.155506 ip-10-0-132-64 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 23:53:59.155506 ip-10-0-132-64 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:59.155506 ip-10-0-132-64 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 23:53:59.155506 ip-10-0-132-64 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:59.157101 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.156982    2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 23:53:59.159202 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159187    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:59.159202 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159202    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159206    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159209    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159212    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159215    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159217    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159220    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159224    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159228    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159231    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159233    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159236    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159239    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159241    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159244    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159246    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159249    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159251    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159254    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:59.159298 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159263    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159266    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159269    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159271    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159274    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159278    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159280    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159283    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159286    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159288    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159290    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159293    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159295    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159298    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159302    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159306    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159309    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159313    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159316    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159319    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:59.159792 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159322    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159325    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159328    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159331    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159333    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159336    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159338    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159340    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159343    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159345    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159347    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159350    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159352    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159354    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159358    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159360    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159363    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159365    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159368    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159370    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:59.160305 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159373    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159376    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159378    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159380    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159384    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159387    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159389    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159391    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159394    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159396    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159399    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159401    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159403    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159406    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159408    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159411    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159414    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159416    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159418    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159421    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:59.160848 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159423    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159426    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159428    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159431    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159433    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159436    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159849    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159856    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159859    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159862    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159865    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159868    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159870    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159873    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159875    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159878    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159881    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159883    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159886    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159888    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:59.161369 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159891    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159893    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159896    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159899    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159901    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159904    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159906    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159909    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159912    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159914    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159917    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159919    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159921    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159924    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159927    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159930    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159932    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159935    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159938    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159941    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:59.161909 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159943    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159945    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159948    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159951    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159953    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159956    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159958    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159962    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159965    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159968    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159971    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159973    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159975    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159978    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159980    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159983    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159985    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159987    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159990    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159992    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:59.162442 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159995    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.159998    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160001    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160003    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160005    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160008    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160010    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160013    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160015    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160018    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160020    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160023    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160025    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160028    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160031    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160033    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160035    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160040    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160043    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:59.162984 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160046    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160049    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160052    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160055    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160057    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160060    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160062    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160065    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160067    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160070    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160072    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160075    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160077    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160149    2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160156    2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160162    2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160167    2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160171    2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160174    2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160179    2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160183    2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 23:53:59.163506 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160186    2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160190    2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160193    2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160197    2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160200    2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160202    2576 flags.go:64] FLAG: --cgroup-root=""
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160205    2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160208    2576 flags.go:64] FLAG: --client-ca-file=""
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160211    2576 flags.go:64] FLAG: --cloud-config=""
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160214    2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160217    2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160222    2576 flags.go:64] FLAG: --cluster-domain=""
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160225    2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160228    2576 flags.go:64] FLAG: --config-dir=""
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160230    2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160234    2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160238    2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160241    2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160244    2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160247    2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160250    2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160253    2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160255    2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160258    2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160261    2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 23:53:59.164066 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160265    2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160269    2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160272    2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160275    2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160278    2576 flags.go:64] FLAG: --enable-server="true"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160281    2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160285    2576 flags.go:64] FLAG: --event-burst="100"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160288    2576 flags.go:64] FLAG: --event-qps="50"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160291    2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160294    2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160297    2576 flags.go:64] FLAG: --eviction-hard=""
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160301    2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160304    2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160306    2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160310    2576 flags.go:64] FLAG: --eviction-soft=""
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160313    2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160316    2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160319    2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160322    2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160324    2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160327    2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160330    2576 flags.go:64] FLAG: --feature-gates=""
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160334    2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160339    2576 flags.go:64] FLAG:
--global-housekeeping-interval="1m0s" Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160342 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 23:53:59.164716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160345 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160348 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160351 2576 flags.go:64] FLAG: --help="false" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160354 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160357 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160360 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160363 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160366 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160369 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160372 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160375 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160378 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 23:53:59.165416 ip-10-0-132-64 
kubenswrapper[2576]: I0424 23:53:59.160380 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160383 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160386 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160389 2576 flags.go:64] FLAG: --kube-reserved="" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160392 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160395 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160397 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160400 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160403 2576 flags.go:64] FLAG: --lock-file="" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160405 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160408 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160411 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 23:53:59.165416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160416 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160419 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160422 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 
23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160424 2576 flags.go:64] FLAG: --logging-format="text" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160428 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160431 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160435 2576 flags.go:64] FLAG: --manifest-url="" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160439 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160443 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160446 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160450 2576 flags.go:64] FLAG: --max-pods="110" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160453 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160456 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160459 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160461 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160464 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160467 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160469 2576 
flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160477 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160480 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160483 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160486 2576 flags.go:64] FLAG: --pod-cidr="" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160489 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160494 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 23:53:59.166133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160497 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160500 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160503 2576 flags.go:64] FLAG: --port="10250" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160505 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160508 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0cecdedc5d8712c7a" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160511 2576 flags.go:64] FLAG: --qos-reserved="" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160514 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 
23:53:59.160517 2576 flags.go:64] FLAG: --register-node="true" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160520 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160522 2576 flags.go:64] FLAG: --register-with-taints="" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160526 2576 flags.go:64] FLAG: --registry-burst="10" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160529 2576 flags.go:64] FLAG: --registry-qps="5" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160531 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160534 2576 flags.go:64] FLAG: --reserved-memory="" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160539 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160541 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160545 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160548 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160551 2576 flags.go:64] FLAG: --runonce="false" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160553 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160556 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160559 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 
23:53:59.160562 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160565 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160568 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160570 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 23:53:59.166840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160573 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160576 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160579 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160582 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160584 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160587 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160590 2576 flags.go:64] FLAG: --system-cgroups="" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160593 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160597 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160600 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160603 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" 
Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160607 2576 flags.go:64] FLAG: --tls-min-version="" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160610 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160612 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160615 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160618 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160621 2576 flags.go:64] FLAG: --v="2" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160625 2576 flags.go:64] FLAG: --version="false" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160629 2576 flags.go:64] FLAG: --vmodule="" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160633 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.160637 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160746 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160753 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160757 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:59.167512 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160760 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:59.167512 ip-10-0-132-64 
kubenswrapper[2576]: W0424 23:53:59.160763 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160765 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160768 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160770 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160773 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160775 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160778 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160780 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160783 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160786 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160788 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160791 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160794 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 
23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160796 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160799 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160801 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160804 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160807 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160809 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160812 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:59.168175 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160814 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160816 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160819 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160821 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160824 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160826 2576 feature_gate.go:328] unrecognized feature 
gate: IrreconcilableMachineConfig Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160828 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160833 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160835 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160839 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160842 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160844 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160847 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160849 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160852 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160854 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160857 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160859 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 
24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160862 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:59.168744 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160864 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160867 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160870 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160873 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160875 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160878 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160880 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160882 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160886 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160889 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160892 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160895 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160898 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160901 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160903 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160906 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160909 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160911 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160913 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:59.169248 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160916 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160920 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160923 2576 
feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160928 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160930 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160933 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160936 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160940 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160943 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160945 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160947 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160950 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160952 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160955 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160957 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 
23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160959 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160962 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160964 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160967 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160969 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:59.169784 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160972 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160974 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160977 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.160979 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.161842 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.168808 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.168823 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168871 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168875 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168879 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168883 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168888 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168891 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168893 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168896 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:59.170342 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168899 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168902 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168904 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168908 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168912 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168915 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168917 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168920 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168938 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168942 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168945 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168947 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168950 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168953 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168956 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168958 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168961 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168964 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168967 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168969 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:59.170766 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168972 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168974 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168977 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168980 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168983 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168985 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168988 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168990 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168993 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168996 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.168998 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169001 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169003 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169006 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169009 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169011 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169014 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169016 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169018 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169021 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:59.171292 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169023 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169026 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169028 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169031 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169033 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169035 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169038 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169041 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169043 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169046 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169049 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169051 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169054 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169056 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169059 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169062 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169064 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169067 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169070 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169072 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:59.171829 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169075 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169077 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169079 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169082 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169084 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169087 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169089 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169092 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169094 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169096 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169099 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169101 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169103 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169106 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169109 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169111 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169113 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:59.172360 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169115 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.169120 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169218 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169224 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169227 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169230 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169233 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169236 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169238 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169241 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169244 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169247 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169251 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169255 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169257 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:59.172820 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169260 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169263 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169266 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169268 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169271 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169274 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169276 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169278 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169281 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169283 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169286 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169288 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169301 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169304 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169306 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169309 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169311 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169314 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169318 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169322 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:59.173218 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169325 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169328 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169331 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169333 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169336 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169339 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169342 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169344 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169347 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169350 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169353 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169355 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169358 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169360 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169363 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169365 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169368 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169370 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169373 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169375 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:59.173888 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169378 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169380 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169383 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169385 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169387 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169390 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169392 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169394 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169397 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169399 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169402 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169404 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169407 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169409 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169412 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169415 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169417 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169419 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169422 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169424 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:59.174439 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169427 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169429 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169432 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169434 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169436 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169439 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169441 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169443 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169446 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169448 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169451 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169453 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:53:59.169455 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.169460 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.170163 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 23:53:59.175041 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.174519 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 23:53:59.175612 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.175598 2576 server.go:1019] "Starting client certificate rotation"
Apr 24 23:53:59.175747 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.175724 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 23:53:59.175795 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.175774 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 23:53:59.201420 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.201396 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 23:53:59.203770 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.203747 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 23:53:59.218393 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.218369 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 24 23:53:59.223514 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.223498 2576 log.go:25] "Validated CRI v1 image API"
Apr 24 23:53:59.224677 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.224651 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 23:53:59.229177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.229155 2576 fs.go:135] Filesystem UUIDs: map[57501e04-b5be-4331-85be-1e1c7dbab998:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 96648baf-b9e8-49ee-a253-d71673bd5615:/dev/nvme0n1p3]
Apr 24 23:53:59.229247 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.229176 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 23:53:59.230606 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.230587 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 23:53:59.234943 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.234820 2576 manager.go:217] Machine: {Timestamp:2026-04-24 23:53:59.23298149 +0000 UTC m=+0.393931122 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3079120 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27d0a462f61c4e33dfcfdc70285a4b SystemUUID:ec27d0a4-62f6-1c4e-33df-cfdc70285a4b BootID:4ba58568-6308-492e-8fec-0d0fea0744d8 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:38:a0:bd:01:b5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:38:a0:bd:01:b5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2a:5d:dd:e2:f4:db Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 23:53:59.234943 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.234938 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 23:53:59.235051 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.235019 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 23:53:59.236982 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.236958 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 23:53:59.237127 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.236986 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-64.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 23:53:59.237179 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.237137 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 23:53:59.237179 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.237146 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 23:53:59.237179 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.237159 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 23:53:59.238086 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.238075 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 23:53:59.239354 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.239344 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 23:53:59.239470 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.239461 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 23:53:59.241835 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.241824 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 23:53:59.241871 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.241845 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 23:53:59.241871 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.241858 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 23:53:59.241871 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.241867 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 24 23:53:59.241974 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.241876 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 23:53:59.242989 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.242974 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 23:53:59.243103 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.243003 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 23:53:59.245901 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.245885 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 23:53:59.247189 ip-10-0-132-64
kubenswrapper[2576]: I0424 23:53:59.247174 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 23:53:59.248973 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.248961 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 23:53:59.249035 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.248978 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 23:53:59.249035 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.248984 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 23:53:59.249035 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.248991 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 23:53:59.249035 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.248996 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 23:53:59.249035 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.249003 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 23:53:59.249035 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.249008 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 23:53:59.249035 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.249014 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 23:53:59.249035 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.249021 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 23:53:59.249035 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.249026 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 23:53:59.249035 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.249035 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 23:53:59.249323 
ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.249043 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 23:53:59.250426 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.250408 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nmrxx" Apr 24 23:53:59.250482 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.250471 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 23:53:59.250521 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.250486 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 23:53:59.251951 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.251933 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-64.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 23:53:59.252381 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.252364 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 23:53:59.252563 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.252549 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-64.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 23:53:59.254028 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.254015 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 23:53:59.254063 ip-10-0-132-64 
kubenswrapper[2576]: I0424 23:53:59.254051 2576 server.go:1295] "Started kubelet" Apr 24 23:53:59.254161 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.254121 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 23:53:59.254850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.254788 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 23:53:59.254945 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.254876 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 23:53:59.255334 ip-10-0-132-64 systemd[1]: Started Kubernetes Kubelet. Apr 24 23:53:59.256511 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.255989 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 23:53:59.257778 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.257710 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 24 23:53:59.262424 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.262398 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nmrxx" Apr 24 23:53:59.264569 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.264547 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 23:53:59.264569 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.264559 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 23:53:59.265496 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.265403 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-64.ec2.internal\" not found" Apr 24 23:53:59.266217 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.266196 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial 
unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 23:53:59.266217 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.266220 2576 factory.go:55] Registering systemd factory Apr 24 23:53:59.266375 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.266223 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 23:53:59.266375 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.266230 2576 factory.go:223] Registration of the systemd container factory successfully Apr 24 23:53:59.266375 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.266245 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 23:53:59.266509 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.266498 2576 factory.go:153] Registering CRI-O factory Apr 24 23:53:59.266554 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.266511 2576 factory.go:223] Registration of the crio container factory successfully Apr 24 23:53:59.266554 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.266528 2576 factory.go:103] Registering Raw factory Apr 24 23:53:59.266554 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.266541 2576 manager.go:1196] Started watching for new ooms in manager Apr 24 23:53:59.266764 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.266746 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 23:53:59.266822 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.266766 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 23:53:59.266902 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.266886 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 24 23:53:59.266949 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.266903 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 24 23:53:59.267161 ip-10-0-132-64 kubenswrapper[2576]: I0424 
23:53:59.267149 2576 manager.go:319] Starting recovery of all containers Apr 24 23:53:59.273636 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.273615 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:59.274724 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.274707 2576 manager.go:324] Recovery completed Apr 24 23:53:59.276399 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.276380 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-64.ec2.internal\" not found" node="ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.279134 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.279121 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:59.281255 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.281242 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:59.281318 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.281270 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:59.281318 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.281282 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:59.281760 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.281746 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 23:53:59.281760 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.281760 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 23:53:59.281849 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.281775 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:53:59.284904 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.284892 2576 
policy_none.go:49] "None policy: Start" Apr 24 23:53:59.284951 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.284908 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 23:53:59.284951 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.284918 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 24 23:53:59.325639 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.325409 2576 manager.go:341] "Starting Device Plugin manager" Apr 24 23:53:59.346314 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.325661 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 23:53:59.346314 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.325675 2576 server.go:85] "Starting device plugin registration server" Apr 24 23:53:59.346314 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.325925 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 23:53:59.346314 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.325935 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 23:53:59.346314 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.326042 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 23:53:59.346314 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.326146 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 23:53:59.346314 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.326155 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 23:53:59.346314 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.326751 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 23:53:59.346314 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.326786 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-64.ec2.internal\" not found" Apr 24 23:53:59.393490 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.393455 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 23:53:59.394889 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.394872 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 23:53:59.394997 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.394896 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 23:53:59.394997 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.394923 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 23:53:59.394997 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.394933 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 23:53:59.394997 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.394975 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 23:53:59.399063 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.399045 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:59.426174 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.426133 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:59.426970 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.426957 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:59.427032 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.426985 2576 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:59.427032 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.426995 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:59.427032 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.427017 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.435364 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.435351 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.435421 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.435373 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-64.ec2.internal\": node \"ip-10-0-132-64.ec2.internal\" not found" Apr 24 23:53:59.465427 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.465409 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-64.ec2.internal\" not found" Apr 24 23:53:59.495839 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.495818 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-64.ec2.internal"] Apr 24 23:53:59.495890 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.495882 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:59.497184 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.497164 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:59.497282 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.497192 2576 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-132-64.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:59.497282 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.497202 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:59.499302 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.499291 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:59.499671 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.499437 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.499671 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.499496 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:59.499964 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.499949 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:59.500044 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.499951 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:59.500044 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.500000 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:59.500044 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.500010 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:59.500044 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.499980 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:59.500184 
ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.500055 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:59.502780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.502764 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.502855 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.502790 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:59.503401 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.503386 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:59.503488 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.503416 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:59.503488 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.503428 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:59.524325 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.524300 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-64.ec2.internal\" not found" node="ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.528877 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.528861 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-64.ec2.internal\" not found" node="ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.566017 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.565997 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-64.ec2.internal\" not found" Apr 
24 23:53:59.666581 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.666557 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-64.ec2.internal\" not found" Apr 24 23:53:59.668770 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.668752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/18a686d43836b89472c2a3a8bbb55e45-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal\" (UID: \"18a686d43836b89472c2a3a8bbb55e45\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.668823 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.668779 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18a686d43836b89472c2a3a8bbb55e45-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal\" (UID: \"18a686d43836b89472c2a3a8bbb55e45\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.668823 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.668794 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d7a770aaafd6135ccc34734060ee4e87-config\") pod \"kube-apiserver-proxy-ip-10-0-132-64.ec2.internal\" (UID: \"d7a770aaafd6135ccc34734060ee4e87\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.767239 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.767215 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-64.ec2.internal\" not found" Apr 24 23:53:59.769416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.769399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/18a686d43836b89472c2a3a8bbb55e45-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal\" (UID: \"18a686d43836b89472c2a3a8bbb55e45\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.769461 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.769426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18a686d43836b89472c2a3a8bbb55e45-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal\" (UID: \"18a686d43836b89472c2a3a8bbb55e45\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.769461 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.769442 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d7a770aaafd6135ccc34734060ee4e87-config\") pod \"kube-apiserver-proxy-ip-10-0-132-64.ec2.internal\" (UID: \"d7a770aaafd6135ccc34734060ee4e87\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.769524 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.769496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d7a770aaafd6135ccc34734060ee4e87-config\") pod \"kube-apiserver-proxy-ip-10-0-132-64.ec2.internal\" (UID: \"d7a770aaafd6135ccc34734060ee4e87\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.769524 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.769512 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18a686d43836b89472c2a3a8bbb55e45-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal\" (UID: \"18a686d43836b89472c2a3a8bbb55e45\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.769591 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.769496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/18a686d43836b89472c2a3a8bbb55e45-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal\" (UID: \"18a686d43836b89472c2a3a8bbb55e45\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.826555 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.826525 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.831026 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:53:59.831010 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-64.ec2.internal" Apr 24 23:53:59.868040 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.868002 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-64.ec2.internal\" not found" Apr 24 23:53:59.968519 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:53:59.968491 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-64.ec2.internal\" not found" Apr 24 23:54:00.064709 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.064611 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:54:00.065574 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.065556 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal" Apr 24 23:54:00.078678 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.078653 2576 warnings.go:110] "Warning: metadata.name: this is used in the 
Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:54:00.080573 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.080560 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-64.ec2.internal"
Apr 24 23:54:00.089012 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.088996 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:54:00.175468 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.175446 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 23:54:00.175952 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.175581 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 23:54:00.175952 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.175606 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 23:54:00.175952 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.175611 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 23:54:00.242357 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.242331 2576 apiserver.go:52] "Watching apiserver"
Apr 24 23:54:00.242475 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.242439 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:54:00.248028 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.248004 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 23:54:00.248372 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.248353 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mj7ls","openshift-cluster-node-tuning-operator/tuned-zgs5z","openshift-dns/node-resolver-gfw9h","openshift-image-registry/node-ca-cm667","openshift-multus/multus-4ql4n","openshift-network-operator/iptables-alerter-5ddrw","kube-system/konnectivity-agent-5xdns","kube-system/kube-apiserver-proxy-ip-10-0-132-64.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal","openshift-multus/multus-additional-cni-plugins-kqqcb","openshift-multus/network-metrics-daemon-wrw7v","openshift-network-diagnostics/network-check-target-p279k"]
Apr 24 23:54:00.253676 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.253659 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5xdns"
Apr 24 23:54:00.255791 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.255769 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.255983 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.255967 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 23:54:00.256067 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.255992 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-w78bb\""
Apr 24 23:54:00.256067 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.255976 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 23:54:00.257573 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.257538 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 23:54:00.257670 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.257601 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8j9vs\""
Apr 24 23:54:00.257757 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.257730 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:54:00.257951 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.257934 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gfw9h"
Apr 24 23:54:00.259724 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.259708 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 23:54:00.259813 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.259707 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qc7t9\""
Apr 24 23:54:00.259875 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.259811 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 23:54:00.260084 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.260070 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cm667"
Apr 24 23:54:00.261793 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.261778 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 23:54:00.261867 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.261809 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 23:54:00.262008 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.261996 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 23:54:00.262120 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.262095 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rgfxr\""
Apr 24 23:54:00.262242 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.262227 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.262313 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.262302 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5ddrw"
Apr 24 23:54:00.264009 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.263994 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 23:54:00.264179 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.264166 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gfckq\""
Apr 24 23:54:00.264245 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.264186 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 23:54:00.264300 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.264233 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 23:48:59 +0000 UTC" deadline="2027-11-09 21:03:40.029552418 +0000 UTC"
Apr 24 23:54:00.264300 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.264269 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13533h9m39.765286089s"
Apr 24 23:54:00.264405 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.264390 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 23:54:00.264529 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.264511 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.264619 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.264603 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 23:54:00.264686 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.264620 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 23:54:00.264686 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.264625 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 23:54:00.264832 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.264710 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pcwkl\""
Apr 24 23:54:00.264832 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.264718 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:54:00.264832 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.264609 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 23:54:00.266531 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.266508 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vpkzx\""
Apr 24 23:54:00.266663 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.266593 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 23:54:00.266663 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.266606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 23:54:00.266891 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.266876 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.267318 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.267303 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 23:54:00.267434 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.267419 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 23:54:00.267481 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.267445 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 23:54:00.268164 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.268146 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 23:54:00.268773 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.268752 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 23:54:00.268860 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.268797 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rx27f\""
Apr 24 23:54:00.268860 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.268821 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 23:54:00.269621 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.269022 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 23:54:00.269621 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.269185 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kqqcb"
Apr 24 23:54:00.270880 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.270864 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 23:54:00.271472 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.271459 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v"
Apr 24 23:54:00.271541 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:00.271522 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5"
Apr 24 23:54:00.271797 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.271783 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 23:54:00.273177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.272344 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wf2j9\""
Apr 24 23:54:00.273177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.272586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/864575cd-867d-4ff1-99fd-72319ad03b97-ovn-node-metrics-cert\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.273177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.272631 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-systemd\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.273177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.272665 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ba532e45-f2da-4349-bf2b-680421e6b958-serviceca\") pod \"node-ca-cm667\" (UID: \"ba532e45-f2da-4349-bf2b-680421e6b958\") " pod="openshift-image-registry/node-ca-cm667"
Apr 24 23:54:00.273177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.272842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-var-lib-cni-multus\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.273177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.272881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/607faeff-0f25-43eb-a633-127b915c9238-agent-certs\") pod \"konnectivity-agent-5xdns\" (UID: \"607faeff-0f25-43eb-a633-127b915c9238\") " pod="kube-system/konnectivity-agent-5xdns"
Apr 24 23:54:00.273177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.272909 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2-iptables-alerter-script\") pod \"iptables-alerter-5ddrw\" (UID: \"a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2\") " pod="openshift-network-operator/iptables-alerter-5ddrw"
Apr 24 23:54:00.273177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.272938 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-modprobe-d\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.273177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.272969 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-kubernetes\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.273177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-run-ovn\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.273177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-device-dir\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.273177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-run-netns\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.273177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-etc-kubernetes\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.273177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/84d61329-00aa-4270-b9d1-b1f736da6f64-hosts-file\") pod \"node-resolver-gfw9h\" (UID: \"84d61329-00aa-4270-b9d1-b1f736da6f64\") " pod="openshift-dns/node-resolver-gfw9h"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/864575cd-867d-4ff1-99fd-72319ad03b97-env-overrides\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273228 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-registration-dir\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-host\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273317 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-run-openvswitch\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-var-lib-kubelet\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fb990ac-0afa-4098-9aa0-0178a341f1cc-cni-binary-copy\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-node-log\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273461 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-multus-cni-dir\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-kubelet\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273547 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-sysconfig\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-cnibin\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273607 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj4sd\" (UniqueName: \"kubernetes.io/projected/a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2-kube-api-access-gj4sd\") pod \"iptables-alerter-5ddrw\" (UID: \"a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2\") " pod="openshift-network-operator/iptables-alerter-5ddrw"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-socket-dir\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273729 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2070654b-e1dc-4cd4-8770-c6f66f355061-tmp\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-system-cni-dir\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.273850 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-run-systemd\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273839 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-slash\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273916 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6x5p\" (UniqueName: \"kubernetes.io/projected/864575cd-867d-4ff1-99fd-72319ad03b97-kube-api-access-l6x5p\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.273965 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-lib-modules\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-tuned\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-os-release\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-multus-conf-dir\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-log-socket\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-cni-netd\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btd2r\" (UniqueName: \"kubernetes.io/projected/046eb080-6f08-4044-85e6-6e9bf141dac3-kube-api-access-btd2r\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274250 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2-host-slash\") pod \"iptables-alerter-5ddrw\" (UID: \"a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2\") " pod="openshift-network-operator/iptables-alerter-5ddrw"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274297 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-systemd-units\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-run-netns\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274352 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-var-lib-openvswitch\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/864575cd-867d-4ff1-99fd-72319ad03b97-ovnkube-config\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274438 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba532e45-f2da-4349-bf2b-680421e6b958-host\") pod \"node-ca-cm667\" (UID: \"ba532e45-f2da-4349-bf2b-680421e6b958\") " pod="openshift-image-registry/node-ca-cm667"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-var-lib-kubelet\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.274576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274497 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/607faeff-0f25-43eb-a633-127b915c9238-konnectivity-ca\") pod \"konnectivity-agent-5xdns\" (UID: \"607faeff-0f25-43eb-a633-127b915c9238\") " pod="kube-system/konnectivity-agent-5xdns"
Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274527 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-etc-openvswitch\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-sys\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274611 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh9dh\" (UniqueName: \"kubernetes.io/projected/ba532e45-f2da-4349-bf2b-680421e6b958-kube-api-access-bh9dh\") pod \"node-ca-cm667\" (UID: \"ba532e45-f2da-4349-bf2b-680421e6b958\") " pod="openshift-image-registry/node-ca-cm667"
Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274648 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-multus-socket-dir-parent\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274679 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-run-ovn-kubernetes\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-cni-bin\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-run-k8s-cni-cncf-io\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.274896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-hostroot\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3fb990ac-0afa-4098-9aa0-0178a341f1cc-multus-daemon-config\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275049 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdwwk\" (UniqueName: \"kubernetes.io/projected/3fb990ac-0afa-4098-9aa0-0178a341f1cc-kube-api-access-zdwwk\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275068 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275094 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gmln\" (UniqueName: \"kubernetes.io/projected/84d61329-00aa-4270-b9d1-b1f736da6f64-kube-api-access-6gmln\") pod \"node-resolver-gfw9h\" (UID: \"84d61329-00aa-4270-b9d1-b1f736da6f64\") " pod="openshift-dns/node-resolver-gfw9h" Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-sysctl-conf\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z" Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:00.275151 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d" Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275186 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-run\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z" Apr 24 23:54:00.275291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275228 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-var-lib-cni-bin\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n" Apr 24 23:54:00.275825 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/84d61329-00aa-4270-b9d1-b1f736da6f64-tmp-dir\") pod \"node-resolver-gfw9h\" (UID: \"84d61329-00aa-4270-b9d1-b1f736da6f64\") " pod="openshift-dns/node-resolver-gfw9h" Apr 24 23:54:00.275825 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275329 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 23:54:00.275825 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275337 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/864575cd-867d-4ff1-99fd-72319ad03b97-ovnkube-script-lib\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.275825 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-etc-selinux\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb" Apr 24 23:54:00.275825 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275418 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-sys-fs\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb" Apr 24 23:54:00.275825 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-sysctl-d\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z" Apr 24 23:54:00.275825 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275516 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qr94\" (UniqueName: \"kubernetes.io/projected/2070654b-e1dc-4cd4-8770-c6f66f355061-kube-api-access-9qr94\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z" Apr 24 23:54:00.275825 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.275567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-run-multus-certs\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n" Apr 24 23:54:00.292365 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:54:00.292331 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18a686d43836b89472c2a3a8bbb55e45.slice/crio-59cb38af23e6af41b71ad661787ba18a9f8d6b8a2d5128aa802a046fc2eee027 WatchSource:0}: Error finding container 59cb38af23e6af41b71ad661787ba18a9f8d6b8a2d5128aa802a046fc2eee027: Status 404 returned error can't find the container with id 59cb38af23e6af41b71ad661787ba18a9f8d6b8a2d5128aa802a046fc2eee027 Apr 24 23:54:00.292985 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:54:00.292961 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7a770aaafd6135ccc34734060ee4e87.slice/crio-c7de553be5f934d5d96a38d7d2d9b403006dd4d5385b3d27ee2965f5a394fb20 WatchSource:0}: Error finding container c7de553be5f934d5d96a38d7d2d9b403006dd4d5385b3d27ee2965f5a394fb20: Status 404 returned error can't find the container with id c7de553be5f934d5d96a38d7d2d9b403006dd4d5385b3d27ee2965f5a394fb20 Apr 24 23:54:00.297091 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.297073 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 23:54:00.299798 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.299783 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wln2s" Apr 24 23:54:00.307968 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.307947 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wln2s" Apr 24 23:54:00.367225 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.367196 2576 
desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 23:54:00.376668 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f788507a-76a8-4714-8f6e-bf17c2e1c40a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.376793 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-sysconfig\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z" Apr 24 23:54:00.376793 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9x7w\" (UniqueName: \"kubernetes.io/projected/a4df8649-8216-4ed9-b023-a6de8b027cd5-kube-api-access-b9x7w\") pod \"network-metrics-daemon-wrw7v\" (UID: \"a4df8649-8216-4ed9-b023-a6de8b027cd5\") " pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:00.376793 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-cnibin\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n" Apr 24 23:54:00.376793 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gj4sd\" (UniqueName: 
\"kubernetes.io/projected/a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2-kube-api-access-gj4sd\") pod \"iptables-alerter-5ddrw\" (UID: \"a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2\") " pod="openshift-network-operator/iptables-alerter-5ddrw" Apr 24 23:54:00.376793 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376785 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-sysconfig\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z" Apr 24 23:54:00.376793 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-socket-dir\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb" Apr 24 23:54:00.377027 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2070654b-e1dc-4cd4-8770-c6f66f355061-tmp\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z" Apr 24 23:54:00.377027 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-system-cni-dir\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n" Apr 24 23:54:00.377027 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/f788507a-76a8-4714-8f6e-bf17c2e1c40a-os-release\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.377027 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376885 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-run-systemd\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.377027 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376908 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-slash\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.377027 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376910 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-cnibin\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n" Apr 24 23:54:00.377027 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-socket-dir\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb" Apr 24 23:54:00.377027 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376969 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-run-systemd\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.377027 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376998 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6x5p\" (UniqueName: \"kubernetes.io/projected/864575cd-867d-4ff1-99fd-72319ad03b97-kube-api-access-l6x5p\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.377027 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.376986 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-system-cni-dir\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n" Apr 24 23:54:00.377383 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-slash\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.377383 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-lib-modules\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z" Apr 24 23:54:00.377383 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-tuned\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z" Apr 24 23:54:00.377383 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377183 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 23:54:00.377383 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377189 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-os-release\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n" Apr 24 23:54:00.377383 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-os-release\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n" Apr 24 23:54:00.377383 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-lib-modules\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z" Apr 24 23:54:00.377760 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-multus-conf-dir\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " 
pod="openshift-multus/multus-4ql4n" Apr 24 23:54:00.377760 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f788507a-76a8-4714-8f6e-bf17c2e1c40a-cnibin\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.377914 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-log-socket\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.377914 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-cni-netd\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.377914 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btd2r\" (UniqueName: \"kubernetes.io/projected/046eb080-6f08-4044-85e6-6e9bf141dac3-kube-api-access-btd2r\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb" Apr 24 23:54:00.377914 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-multus-conf-dir\") pod \"multus-4ql4n\" (UID: 
\"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n" Apr 24 23:54:00.377914 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f788507a-76a8-4714-8f6e-bf17c2e1c40a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.377914 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-log-socket\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.377914 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlktn\" (UniqueName: \"kubernetes.io/projected/f788507a-76a8-4714-8f6e-bf17c2e1c40a-kube-api-access-xlktn\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.377914 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-cni-netd\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.377914 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2-host-slash\") pod \"iptables-alerter-5ddrw\" (UID: \"a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2\") " pod="openshift-network-operator/iptables-alerter-5ddrw" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377928 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-systemd-units\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377946 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2-host-slash\") pod \"iptables-alerter-5ddrw\" (UID: \"a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2\") " pod="openshift-network-operator/iptables-alerter-5ddrw" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-run-netns\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-systemd-units\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-run-netns\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.377986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-var-lib-openvswitch\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-var-lib-openvswitch\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/864575cd-867d-4ff1-99fd-72319ad03b97-ovnkube-config\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba532e45-f2da-4349-bf2b-680421e6b958-host\") pod \"node-ca-cm667\" (UID: \"ba532e45-f2da-4349-bf2b-680421e6b958\") " pod="openshift-image-registry/node-ca-cm667" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-var-lib-kubelet\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/607faeff-0f25-43eb-a633-127b915c9238-konnectivity-ca\") pod \"konnectivity-agent-5xdns\" (UID: \"607faeff-0f25-43eb-a633-127b915c9238\") " pod="kube-system/konnectivity-agent-5xdns" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-var-lib-kubelet\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-etc-openvswitch\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378147 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba532e45-f2da-4349-bf2b-680421e6b958-host\") pod \"node-ca-cm667\" (UID: \"ba532e45-f2da-4349-bf2b-680421e6b958\") " pod="openshift-image-registry/node-ca-cm667" Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378150 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-sys\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378196 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh9dh\" (UniqueName: \"kubernetes.io/projected/ba532e45-f2da-4349-bf2b-680421e6b958-kube-api-access-bh9dh\") pod \"node-ca-cm667\" (UID: \"ba532e45-f2da-4349-bf2b-680421e6b958\") " pod="openshift-image-registry/node-ca-cm667"
Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378198 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-etc-openvswitch\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.378291 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378227 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-multus-socket-dir-parent\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-run-ovn-kubernetes\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-cni-bin\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378326 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-run-k8s-cni-cncf-io\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378350 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-hostroot\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3fb990ac-0afa-4098-9aa0-0178a341f1cc-multus-daemon-config\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdwwk\" (UniqueName: \"kubernetes.io/projected/3fb990ac-0afa-4098-9aa0-0178a341f1cc-kube-api-access-zdwwk\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-sys\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-cni-bin\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378463 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-run-ovn-kubernetes\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-multus-socket-dir-parent\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gmln\" (UniqueName: \"kubernetes.io/projected/84d61329-00aa-4270-b9d1-b1f736da6f64-kube-api-access-6gmln\") pod \"node-resolver-gfw9h\" (UID: \"84d61329-00aa-4270-b9d1-b1f736da6f64\") " pod="openshift-dns/node-resolver-gfw9h"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-hostroot\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-sysctl-conf\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-run-k8s-cni-cncf-io\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-run\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-var-lib-cni-bin\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.379162 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/864575cd-867d-4ff1-99fd-72319ad03b97-ovnkube-config\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378674 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs\") pod \"network-metrics-daemon-wrw7v\" (UID: \"a4df8649-8216-4ed9-b023-a6de8b027cd5\") " pod="openshift-multus/network-metrics-daemon-wrw7v"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/84d61329-00aa-4270-b9d1-b1f736da6f64-tmp-dir\") pod \"node-resolver-gfw9h\" (UID: \"84d61329-00aa-4270-b9d1-b1f736da6f64\") " pod="openshift-dns/node-resolver-gfw9h"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378727 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-var-lib-cni-bin\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378745 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/864575cd-867d-4ff1-99fd-72319ad03b97-ovnkube-script-lib\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378758 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-sysctl-conf\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-run\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-etc-selinux\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378839 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-sys-fs\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-sysctl-d\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qr94\" (UniqueName: \"kubernetes.io/projected/2070654b-e1dc-4cd4-8770-c6f66f355061-kube-api-access-9qr94\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-run-multus-certs\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.378962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f788507a-76a8-4714-8f6e-bf17c2e1c40a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/864575cd-867d-4ff1-99fd-72319ad03b97-ovn-node-metrics-cert\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379063 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-systemd\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379068 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/84d61329-00aa-4270-b9d1-b1f736da6f64-tmp-dir\") pod \"node-resolver-gfw9h\" (UID: \"84d61329-00aa-4270-b9d1-b1f736da6f64\") " pod="openshift-dns/node-resolver-gfw9h"
Apr 24 23:54:00.380195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ba532e45-f2da-4349-bf2b-680421e6b958-serviceca\") pod \"node-ca-cm667\" (UID: \"ba532e45-f2da-4349-bf2b-680421e6b958\") " pod="openshift-image-registry/node-ca-cm667"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/607faeff-0f25-43eb-a633-127b915c9238-konnectivity-ca\") pod \"konnectivity-agent-5xdns\" (UID: \"607faeff-0f25-43eb-a633-127b915c9238\") " pod="kube-system/konnectivity-agent-5xdns"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3fb990ac-0afa-4098-9aa0-0178a341f1cc-multus-daemon-config\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379173 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-sysctl-d\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-var-lib-cni-multus\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379211 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-run-multus-certs\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379256 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-systemd\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-sys-fs\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379290 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/864575cd-867d-4ff1-99fd-72319ad03b97-ovnkube-script-lib\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379315 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-var-lib-cni-multus\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-etc-selinux\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ba532e45-f2da-4349-bf2b-680421e6b958-serviceca\") pod \"node-ca-cm667\" (UID: \"ba532e45-f2da-4349-bf2b-680421e6b958\") " pod="openshift-image-registry/node-ca-cm667"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379851 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/607faeff-0f25-43eb-a633-127b915c9238-agent-certs\") pod \"konnectivity-agent-5xdns\" (UID: \"607faeff-0f25-43eb-a633-127b915c9238\") " pod="kube-system/konnectivity-agent-5xdns"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2-iptables-alerter-script\") pod \"iptables-alerter-5ddrw\" (UID: \"a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2\") " pod="openshift-network-operator/iptables-alerter-5ddrw"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-modprobe-d\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-kubernetes\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.379969 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f788507a-76a8-4714-8f6e-bf17c2e1c40a-system-cni-dir\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-modprobe-d\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.381053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380037 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-kubernetes\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380081 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-run-ovn\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-device-dir\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-run-netns\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380113 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-run-ovn\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-etc-kubernetes\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380170 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-device-dir\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380182 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f788507a-76a8-4714-8f6e-bf17c2e1c40a-cni-binary-copy\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/84d61329-00aa-4270-b9d1-b1f736da6f64-hosts-file\") pod \"node-resolver-gfw9h\" (UID: \"84d61329-00aa-4270-b9d1-b1f736da6f64\") " pod="openshift-dns/node-resolver-gfw9h"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-etc-kubernetes\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380244 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/864575cd-867d-4ff1-99fd-72319ad03b97-env-overrides\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380259 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/84d61329-00aa-4270-b9d1-b1f736da6f64-hosts-file\") pod \"node-resolver-gfw9h\" (UID: \"84d61329-00aa-4270-b9d1-b1f736da6f64\") " pod="openshift-dns/node-resolver-gfw9h"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-registration-dir\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-host\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmpxd\" (UniqueName: \"kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd\") pod \"network-check-target-p279k\" (UID: \"0badefdd-5292-410f-94d9-30bdbec0d66d\") " pod="openshift-network-diagnostics/network-check-target-p279k"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-host\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380553 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-run-openvswitch\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-var-lib-kubelet\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.381780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fb990ac-0afa-4098-9aa0-0178a341f1cc-cni-binary-copy\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-node-log\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380655 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-multus-cni-dir\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-kubelet\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-kubelet\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380391 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/046eb080-6f08-4044-85e6-6e9bf141dac3-registration-dir\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380864 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-run-openvswitch\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2-iptables-alerter-script\") pod \"iptables-alerter-5ddrw\" (UID: \"a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2\") " pod="openshift-network-operator/iptables-alerter-5ddrw"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380873 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2070654b-e1dc-4cd4-8770-c6f66f355061-etc-tuned\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380916 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2070654b-e1dc-4cd4-8770-c6f66f355061-var-lib-kubelet\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-node-log\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/864575cd-867d-4ff1-99fd-72319ad03b97-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2070654b-e1dc-4cd4-8770-c6f66f355061-tmp\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.380999 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-host-run-netns\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.381056 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fb990ac-0afa-4098-9aa0-0178a341f1cc-multus-cni-dir\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.381384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/864575cd-867d-4ff1-99fd-72319ad03b97-env-overrides\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.381660 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fb990ac-0afa-4098-9aa0-0178a341f1cc-cni-binary-copy\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n"
Apr 24 23:54:00.382244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.381878 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/864575cd-867d-4ff1-99fd-72319ad03b97-ovn-node-metrics-cert\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.382767 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.382115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/607faeff-0f25-43eb-a633-127b915c9238-agent-certs\") pod \"konnectivity-agent-5xdns\" (UID: \"607faeff-0f25-43eb-a633-127b915c9238\") " pod="kube-system/konnectivity-agent-5xdns"
Apr 24 23:54:00.395253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.395215 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj4sd\" (UniqueName: \"kubernetes.io/projected/a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2-kube-api-access-gj4sd\") pod \"iptables-alerter-5ddrw\" (UID: \"a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2\") " pod="openshift-network-operator/iptables-alerter-5ddrw"
Apr 24 23:54:00.396012 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.395983 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btd2r\" (UniqueName: \"kubernetes.io/projected/046eb080-6f08-4044-85e6-6e9bf141dac3-kube-api-access-btd2r\") pod \"aws-ebs-csi-driver-node-xphwb\" (UID: \"046eb080-6f08-4044-85e6-6e9bf141dac3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb"
Apr 24 23:54:00.396143 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.396121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6x5p\" (UniqueName: \"kubernetes.io/projected/864575cd-867d-4ff1-99fd-72319ad03b97-kube-api-access-l6x5p\") pod \"ovnkube-node-mj7ls\" (UID: \"864575cd-867d-4ff1-99fd-72319ad03b97\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:00.396216 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.396193 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qr94\" (UniqueName: \"kubernetes.io/projected/2070654b-e1dc-4cd4-8770-c6f66f355061-kube-api-access-9qr94\") pod \"tuned-zgs5z\" (UID: \"2070654b-e1dc-4cd4-8770-c6f66f355061\") " pod="openshift-cluster-node-tuning-operator/tuned-zgs5z"
Apr 24 23:54:00.396257 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.396226 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gmln\" (UniqueName: \"kubernetes.io/projected/84d61329-00aa-4270-b9d1-b1f736da6f64-kube-api-access-6gmln\") pod \"node-resolver-gfw9h\" (UID: \"84d61329-00aa-4270-b9d1-b1f736da6f64\") " pod="openshift-dns/node-resolver-gfw9h"
Apr 24 23:54:00.396360 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.396335 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh9dh\" (UniqueName: \"kubernetes.io/projected/ba532e45-f2da-4349-bf2b-680421e6b958-kube-api-access-bh9dh\") pod \"node-ca-cm667\" (UID: \"ba532e45-f2da-4349-bf2b-680421e6b958\") " pod="openshift-image-registry/node-ca-cm667"
Apr 24 23:54:00.396360
ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.396348 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdwwk\" (UniqueName: \"kubernetes.io/projected/3fb990ac-0afa-4098-9aa0-0178a341f1cc-kube-api-access-zdwwk\") pod \"multus-4ql4n\" (UID: \"3fb990ac-0afa-4098-9aa0-0178a341f1cc\") " pod="openshift-multus/multus-4ql4n" Apr 24 23:54:00.397910 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.397870 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal" event={"ID":"18a686d43836b89472c2a3a8bbb55e45","Type":"ContainerStarted","Data":"59cb38af23e6af41b71ad661787ba18a9f8d6b8a2d5128aa802a046fc2eee027"} Apr 24 23:54:00.398782 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.398763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-64.ec2.internal" event={"ID":"d7a770aaafd6135ccc34734060ee4e87","Type":"ContainerStarted","Data":"c7de553be5f934d5d96a38d7d2d9b403006dd4d5385b3d27ee2965f5a394fb20"} Apr 24 23:54:00.481840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.481816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f788507a-76a8-4714-8f6e-bf17c2e1c40a-cnibin\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.481954 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.481845 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f788507a-76a8-4714-8f6e-bf17c2e1c40a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.481954 ip-10-0-132-64 kubenswrapper[2576]: 
I0424 23:54:00.481862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlktn\" (UniqueName: \"kubernetes.io/projected/f788507a-76a8-4714-8f6e-bf17c2e1c40a-kube-api-access-xlktn\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.481954 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.481895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs\") pod \"network-metrics-daemon-wrw7v\" (UID: \"a4df8649-8216-4ed9-b023-a6de8b027cd5\") " pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:00.481954 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.481921 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f788507a-76a8-4714-8f6e-bf17c2e1c40a-cnibin\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.482146 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:00.481982 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:00.482146 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.482010 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f788507a-76a8-4714-8f6e-bf17c2e1c40a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.482146 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:00.482048 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs podName:a4df8649-8216-4ed9-b023-a6de8b027cd5 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:00.982017877 +0000 UTC m=+2.142967496 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs") pod "network-metrics-daemon-wrw7v" (UID: "a4df8649-8216-4ed9-b023-a6de8b027cd5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:00.482146 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.482076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f788507a-76a8-4714-8f6e-bf17c2e1c40a-system-cni-dir\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.482146 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.482102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f788507a-76a8-4714-8f6e-bf17c2e1c40a-cni-binary-copy\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.482146 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.482127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmpxd\" (UniqueName: \"kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd\") pod \"network-check-target-p279k\" (UID: \"0badefdd-5292-410f-94d9-30bdbec0d66d\") " pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:00.482146 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.482140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f788507a-76a8-4714-8f6e-bf17c2e1c40a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.482479 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.482158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f788507a-76a8-4714-8f6e-bf17c2e1c40a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.482479 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.482171 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f788507a-76a8-4714-8f6e-bf17c2e1c40a-system-cni-dir\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.482479 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.482184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9x7w\" (UniqueName: \"kubernetes.io/projected/a4df8649-8216-4ed9-b023-a6de8b027cd5-kube-api-access-b9x7w\") pod \"network-metrics-daemon-wrw7v\" (UID: \"a4df8649-8216-4ed9-b023-a6de8b027cd5\") " pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:00.482479 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.482221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f788507a-76a8-4714-8f6e-bf17c2e1c40a-os-release\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " 
pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.482479 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.482300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f788507a-76a8-4714-8f6e-bf17c2e1c40a-os-release\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.482479 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.482385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f788507a-76a8-4714-8f6e-bf17c2e1c40a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.482773 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.482595 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f788507a-76a8-4714-8f6e-bf17c2e1c40a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.482773 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.482629 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f788507a-76a8-4714-8f6e-bf17c2e1c40a-cni-binary-copy\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.491815 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:00.491795 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:54:00.491815 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:00.491812 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:54:00.491970 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:00.491821 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xmpxd for pod openshift-network-diagnostics/network-check-target-p279k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:00.491970 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:00.491862 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd podName:0badefdd-5292-410f-94d9-30bdbec0d66d nodeName:}" failed. No retries permitted until 2026-04-24 23:54:00.991850974 +0000 UTC m=+2.152800592 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xmpxd" (UniqueName: "kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd") pod "network-check-target-p279k" (UID: "0badefdd-5292-410f-94d9-30bdbec0d66d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:00.494429 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.494412 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9x7w\" (UniqueName: \"kubernetes.io/projected/a4df8649-8216-4ed9-b023-a6de8b027cd5-kube-api-access-b9x7w\") pod \"network-metrics-daemon-wrw7v\" (UID: \"a4df8649-8216-4ed9-b023-a6de8b027cd5\") " pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:00.494505 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.494427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlktn\" (UniqueName: \"kubernetes.io/projected/f788507a-76a8-4714-8f6e-bf17c2e1c40a-kube-api-access-xlktn\") pod \"multus-additional-cni-plugins-kqqcb\" (UID: \"f788507a-76a8-4714-8f6e-bf17c2e1c40a\") " pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.595714 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.595626 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5xdns" Apr 24 23:54:00.601322 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.601297 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zgs5z" Apr 24 23:54:00.601810 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:54:00.601767 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod607faeff_0f25_43eb_a633_127b915c9238.slice/crio-6f6d5143899c7f8dd2017abf6fc02549e25e9c4342a0379d5a564f8a3d8c81f7 WatchSource:0}: Error finding container 6f6d5143899c7f8dd2017abf6fc02549e25e9c4342a0379d5a564f8a3d8c81f7: Status 404 returned error can't find the container with id 6f6d5143899c7f8dd2017abf6fc02549e25e9c4342a0379d5a564f8a3d8c81f7 Apr 24 23:54:00.607794 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.607746 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gfw9h" Apr 24 23:54:00.608370 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:54:00.608295 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2070654b_e1dc_4cd4_8770_c6f66f355061.slice/crio-379e2b7eee4a237780835fc768a746f6288e5e736a25651cad10da72b4afac43 WatchSource:0}: Error finding container 379e2b7eee4a237780835fc768a746f6288e5e736a25651cad10da72b4afac43: Status 404 returned error can't find the container with id 379e2b7eee4a237780835fc768a746f6288e5e736a25651cad10da72b4afac43 Apr 24 23:54:00.613460 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.613430 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cm667" Apr 24 23:54:00.614561 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:54:00.614538 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d61329_00aa_4270_b9d1_b1f736da6f64.slice/crio-45f1c62b381825c9c2a1b8e1973c43337850b46f75cddd106cb45d4d9dfabf98 WatchSource:0}: Error finding container 45f1c62b381825c9c2a1b8e1973c43337850b46f75cddd106cb45d4d9dfabf98: Status 404 returned error can't find the container with id 45f1c62b381825c9c2a1b8e1973c43337850b46f75cddd106cb45d4d9dfabf98 Apr 24 23:54:00.619234 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.618864 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4ql4n" Apr 24 23:54:00.621047 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:54:00.621008 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba532e45_f2da_4349_bf2b_680421e6b958.slice/crio-ce73761a19bc086abde30ff33a20d0f48b2046dd04565c128e5adc529b3d5e1e WatchSource:0}: Error finding container ce73761a19bc086abde30ff33a20d0f48b2046dd04565c128e5adc529b3d5e1e: Status 404 returned error can't find the container with id ce73761a19bc086abde30ff33a20d0f48b2046dd04565c128e5adc529b3d5e1e Apr 24 23:54:00.625958 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.625937 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5ddrw" Apr 24 23:54:00.631231 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.631214 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:54:00.633336 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:54:00.633280 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2f3e825_c2e5_44d7_9f59_45dc7ea2eba2.slice/crio-6d7b913d4206f4aed29ef2baefeb895066a95f73a9bdb8617020c263f378942c WatchSource:0}: Error finding container 6d7b913d4206f4aed29ef2baefeb895066a95f73a9bdb8617020c263f378942c: Status 404 returned error can't find the container with id 6d7b913d4206f4aed29ef2baefeb895066a95f73a9bdb8617020c263f378942c Apr 24 23:54:00.636890 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.636747 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb" Apr 24 23:54:00.640029 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:54:00.639992 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod864575cd_867d_4ff1_99fd_72319ad03b97.slice/crio-5bec1b411447c62312c221f4c42f8ef93971dbde391c6a6c86f0cb75c12440c7 WatchSource:0}: Error finding container 5bec1b411447c62312c221f4c42f8ef93971dbde391c6a6c86f0cb75c12440c7: Status 404 returned error can't find the container with id 5bec1b411447c62312c221f4c42f8ef93971dbde391c6a6c86f0cb75c12440c7 Apr 24 23:54:00.643247 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.643227 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kqqcb" Apr 24 23:54:00.646932 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:54:00.646908 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod046eb080_6f08_4044_85e6_6e9bf141dac3.slice/crio-73242a9f952a6e4bc523aa3b33bd87098f340d949ead9eee5a1d51c638ad0826 WatchSource:0}: Error finding container 73242a9f952a6e4bc523aa3b33bd87098f340d949ead9eee5a1d51c638ad0826: Status 404 returned error can't find the container with id 73242a9f952a6e4bc523aa3b33bd87098f340d949ead9eee5a1d51c638ad0826 Apr 24 23:54:00.653270 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:54:00.653243 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf788507a_76a8_4714_8f6e_bf17c2e1c40a.slice/crio-37a05a9018085411babce04d41f7bb0ad6ee921f44815471e2e25aa1722000b8 WatchSource:0}: Error finding container 37a05a9018085411babce04d41f7bb0ad6ee921f44815471e2e25aa1722000b8: Status 404 returned error can't find the container with id 37a05a9018085411babce04d41f7bb0ad6ee921f44815471e2e25aa1722000b8 Apr 24 23:54:00.985227 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:00.985153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs\") pod \"network-metrics-daemon-wrw7v\" (UID: \"a4df8649-8216-4ed9-b023-a6de8b027cd5\") " pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:00.985410 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:00.985309 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:00.985410 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:00.985371 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs podName:a4df8649-8216-4ed9-b023-a6de8b027cd5 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:01.985353337 +0000 UTC m=+3.146302969 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs") pod "network-metrics-daemon-wrw7v" (UID: "a4df8649-8216-4ed9-b023-a6de8b027cd5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:01.085540 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.085501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmpxd\" (UniqueName: \"kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd\") pod \"network-check-target-p279k\" (UID: \"0badefdd-5292-410f-94d9-30bdbec0d66d\") " pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:01.085740 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:01.085679 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:54:01.085740 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:01.085717 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:54:01.085740 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:01.085730 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xmpxd for pod openshift-network-diagnostics/network-check-target-p279k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:01.085900 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:01.085787 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd podName:0badefdd-5292-410f-94d9-30bdbec0d66d nodeName:}" failed. No retries permitted until 2026-04-24 23:54:02.085767114 +0000 UTC m=+3.246716749 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xmpxd" (UniqueName: "kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd") pod "network-check-target-p279k" (UID: "0badefdd-5292-410f-94d9-30bdbec0d66d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:01.305810 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.305595 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:54:01.309140 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.309047 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:49:00 +0000 UTC" deadline="2028-01-21 06:25:05.032464334 +0000 UTC" Apr 24 23:54:01.309140 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.309085 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15270h31m3.723383632s" Apr 24 23:54:01.397357 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.396838 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:01.397357 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:01.396959 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d" Apr 24 23:54:01.446195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.446160 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" event={"ID":"864575cd-867d-4ff1-99fd-72319ad03b97","Type":"ContainerStarted","Data":"5bec1b411447c62312c221f4c42f8ef93971dbde391c6a6c86f0cb75c12440c7"} Apr 24 23:54:01.461727 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.461462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4ql4n" event={"ID":"3fb990ac-0afa-4098-9aa0-0178a341f1cc","Type":"ContainerStarted","Data":"cfb931a510ae476d26e407b1ee524b10e6d460b7b4cde65ee445313c42a4f426"} Apr 24 23:54:01.472153 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.472089 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cm667" event={"ID":"ba532e45-f2da-4349-bf2b-680421e6b958","Type":"ContainerStarted","Data":"ce73761a19bc086abde30ff33a20d0f48b2046dd04565c128e5adc529b3d5e1e"} Apr 24 23:54:01.484024 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.483937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5xdns" event={"ID":"607faeff-0f25-43eb-a633-127b915c9238","Type":"ContainerStarted","Data":"6f6d5143899c7f8dd2017abf6fc02549e25e9c4342a0379d5a564f8a3d8c81f7"} Apr 24 23:54:01.502565 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.502497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5ddrw" event={"ID":"a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2","Type":"ContainerStarted","Data":"6d7b913d4206f4aed29ef2baefeb895066a95f73a9bdb8617020c263f378942c"} Apr 24 23:54:01.507359 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.507332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gfw9h" 
event={"ID":"84d61329-00aa-4270-b9d1-b1f736da6f64","Type":"ContainerStarted","Data":"45f1c62b381825c9c2a1b8e1973c43337850b46f75cddd106cb45d4d9dfabf98"} Apr 24 23:54:01.543270 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.543235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zgs5z" event={"ID":"2070654b-e1dc-4cd4-8770-c6f66f355061","Type":"ContainerStarted","Data":"379e2b7eee4a237780835fc768a746f6288e5e736a25651cad10da72b4afac43"} Apr 24 23:54:01.568324 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.568252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqqcb" event={"ID":"f788507a-76a8-4714-8f6e-bf17c2e1c40a","Type":"ContainerStarted","Data":"37a05a9018085411babce04d41f7bb0ad6ee921f44815471e2e25aa1722000b8"} Apr 24 23:54:01.574741 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.573792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb" event={"ID":"046eb080-6f08-4044-85e6-6e9bf141dac3","Type":"ContainerStarted","Data":"73242a9f952a6e4bc523aa3b33bd87098f340d949ead9eee5a1d51c638ad0826"} Apr 24 23:54:01.628546 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.628515 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:54:01.992138 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:01.992006 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs\") pod \"network-metrics-daemon-wrw7v\" (UID: \"a4df8649-8216-4ed9-b023-a6de8b027cd5\") " pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:01.992308 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:01.992143 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:01.992308 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:01.992206 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs podName:a4df8649-8216-4ed9-b023-a6de8b027cd5 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:03.992185752 +0000 UTC m=+5.153135385 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs") pod "network-metrics-daemon-wrw7v" (UID: "a4df8649-8216-4ed9-b023-a6de8b027cd5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:02.092808 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:02.092773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmpxd\" (UniqueName: \"kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd\") pod \"network-check-target-p279k\" (UID: \"0badefdd-5292-410f-94d9-30bdbec0d66d\") " pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:02.092965 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:02.092931 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:54:02.092965 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:02.092950 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:54:02.092965 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:02.092963 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xmpxd for pod openshift-network-diagnostics/network-check-target-p279k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:02.093126 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:02.093017 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd podName:0badefdd-5292-410f-94d9-30bdbec0d66d nodeName:}" failed. No retries permitted until 2026-04-24 23:54:04.093000601 +0000 UTC m=+5.253950236 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xmpxd" (UniqueName: "kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd") pod "network-check-target-p279k" (UID: "0badefdd-5292-410f-94d9-30bdbec0d66d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:02.310114 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:02.310074 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:49:00 +0000 UTC" deadline="2028-02-08 20:55:41.625954194 +0000 UTC" Apr 24 23:54:02.310114 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:02.310112 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15717h1m39.315845799s" Apr 24 23:54:02.347854 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:02.347827 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:54:02.396272 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:02.396239 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:02.396425 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:02.396370 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5" Apr 24 23:54:03.395968 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:03.395932 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:03.396394 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:03.396062 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d" Apr 24 23:54:04.010591 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:04.010546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs\") pod \"network-metrics-daemon-wrw7v\" (UID: \"a4df8649-8216-4ed9-b023-a6de8b027cd5\") " pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:04.010800 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:04.010778 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:04.010882 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:04.010846 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs podName:a4df8649-8216-4ed9-b023-a6de8b027cd5 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:08.010831419 +0000 UTC m=+9.171781043 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs") pod "network-metrics-daemon-wrw7v" (UID: "a4df8649-8216-4ed9-b023-a6de8b027cd5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:04.111962 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:04.111359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmpxd\" (UniqueName: \"kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd\") pod \"network-check-target-p279k\" (UID: \"0badefdd-5292-410f-94d9-30bdbec0d66d\") " pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:04.111962 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:04.111512 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:54:04.111962 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:04.111532 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:54:04.111962 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:04.111545 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xmpxd for pod openshift-network-diagnostics/network-check-target-p279k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:04.111962 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:04.111606 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd podName:0badefdd-5292-410f-94d9-30bdbec0d66d nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:08.111587869 +0000 UTC m=+9.272537493 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xmpxd" (UniqueName: "kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd") pod "network-check-target-p279k" (UID: "0badefdd-5292-410f-94d9-30bdbec0d66d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:04.395685 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:04.395590 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:04.395855 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:04.395751 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5" Apr 24 23:54:05.395309 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:05.395264 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:05.395717 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:05.395394 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d" Apr 24 23:54:06.395839 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:06.395363 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:06.395839 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:06.395498 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5" Apr 24 23:54:07.396873 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:07.396836 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:07.397241 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:07.396935 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d" Apr 24 23:54:08.047806 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:08.047765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs\") pod \"network-metrics-daemon-wrw7v\" (UID: \"a4df8649-8216-4ed9-b023-a6de8b027cd5\") " pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:08.048074 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:08.047946 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:08.048074 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:08.048026 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs podName:a4df8649-8216-4ed9-b023-a6de8b027cd5 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:16.048004246 +0000 UTC m=+17.208953870 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs") pod "network-metrics-daemon-wrw7v" (UID: "a4df8649-8216-4ed9-b023-a6de8b027cd5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:08.148917 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:08.148875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmpxd\" (UniqueName: \"kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd\") pod \"network-check-target-p279k\" (UID: \"0badefdd-5292-410f-94d9-30bdbec0d66d\") " pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:08.149110 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:08.149088 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:54:08.149110 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:08.149106 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:54:08.149241 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:08.149118 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xmpxd for pod openshift-network-diagnostics/network-check-target-p279k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:08.149241 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:08.149178 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd podName:0badefdd-5292-410f-94d9-30bdbec0d66d nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:16.149156995 +0000 UTC m=+17.310106620 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xmpxd" (UniqueName: "kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd") pod "network-check-target-p279k" (UID: "0badefdd-5292-410f-94d9-30bdbec0d66d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:08.395490 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:08.395399 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:08.395636 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:08.395541 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5" Apr 24 23:54:09.396317 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:09.396281 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:09.396774 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:09.396377 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d" Apr 24 23:54:10.395535 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:10.395503 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:10.395673 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:10.395618 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5" Apr 24 23:54:11.396019 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:11.395983 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:11.396392 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:11.396092 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d" Apr 24 23:54:12.396099 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:12.396057 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:12.396566 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:12.396203 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5" Apr 24 23:54:13.395537 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:13.395498 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:13.395681 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:13.395631 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d" Apr 24 23:54:14.395954 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:14.395917 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:14.396389 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:14.396054 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5" Apr 24 23:54:15.395518 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:15.395485 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:15.395671 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:15.395608 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d" Apr 24 23:54:16.102108 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:16.102070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs\") pod \"network-metrics-daemon-wrw7v\" (UID: \"a4df8649-8216-4ed9-b023-a6de8b027cd5\") " pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:16.102535 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:16.102204 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:16.102535 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:16.102276 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs podName:a4df8649-8216-4ed9-b023-a6de8b027cd5 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:32.102255224 +0000 UTC m=+33.263204845 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs") pod "network-metrics-daemon-wrw7v" (UID: "a4df8649-8216-4ed9-b023-a6de8b027cd5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:16.202508 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:16.202471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmpxd\" (UniqueName: \"kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd\") pod \"network-check-target-p279k\" (UID: \"0badefdd-5292-410f-94d9-30bdbec0d66d\") " pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:16.202670 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:16.202616 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:54:16.202670 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:16.202635 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:54:16.202670 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:16.202646 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xmpxd for pod openshift-network-diagnostics/network-check-target-p279k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:16.202856 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:16.202728 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd podName:0badefdd-5292-410f-94d9-30bdbec0d66d nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:32.202689412 +0000 UTC m=+33.363639037 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xmpxd" (UniqueName: "kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd") pod "network-check-target-p279k" (UID: "0badefdd-5292-410f-94d9-30bdbec0d66d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:16.396207 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:16.396133 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:16.396352 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:16.396241 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5" Apr 24 23:54:17.396405 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:17.396201 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:17.396921 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:17.396487 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d" Apr 24 23:54:18.395091 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:18.395061 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:18.395251 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:18.395158 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5" Apr 24 23:54:19.354882 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:19.354730 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod864575cd_867d_4ff1_99fd_72319ad03b97.slice/crio-4530e8d8b60877c55db15347200cebeb1a282da3785d8cea3bf5650e465f702d.scope\": RecentStats: unable to find data in memory cache]" Apr 24 23:54:19.396454 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.396428 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:19.396596 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:19.396517 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d" Apr 24 23:54:19.615573 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.615543 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zgs5z" event={"ID":"2070654b-e1dc-4cd4-8770-c6f66f355061","Type":"ContainerStarted","Data":"16121f3ef799f2d3372fdbfcfe308bd948a88568363db88ccd7e77dbad575ca1"} Apr 24 23:54:19.616847 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.616821 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-64.ec2.internal" event={"ID":"d7a770aaafd6135ccc34734060ee4e87","Type":"ContainerStarted","Data":"93529b876cde9ee5da94c90d28f736b04ba14abd9af9b03c4b54f921e941a476"} Apr 24 23:54:19.619135 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.619110 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log" Apr 24 23:54:19.619504 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.619482 2576 generic.go:358] "Generic (PLEG): container finished" podID="864575cd-867d-4ff1-99fd-72319ad03b97" containerID="709ab6c2e08a07603587524a1f5dd71e789bae9b383f620b4add6b242809b198" exitCode=1 Apr 24 23:54:19.619590 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.619555 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" event={"ID":"864575cd-867d-4ff1-99fd-72319ad03b97","Type":"ContainerStarted","Data":"4530e8d8b60877c55db15347200cebeb1a282da3785d8cea3bf5650e465f702d"} Apr 24 23:54:19.619590 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.619575 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" 
event={"ID":"864575cd-867d-4ff1-99fd-72319ad03b97","Type":"ContainerStarted","Data":"6c0aecdf7811099538d22e2f22c80b7199847a98c123ad7ec7d1c9f4a2ec06af"} Apr 24 23:54:19.619590 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.619584 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" event={"ID":"864575cd-867d-4ff1-99fd-72319ad03b97","Type":"ContainerStarted","Data":"a6530fe1743157e77a061204ba7a24aaf0694bf5dbab4bd4d0904551741bb153"} Apr 24 23:54:19.619723 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.619596 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" event={"ID":"864575cd-867d-4ff1-99fd-72319ad03b97","Type":"ContainerStarted","Data":"d503328148ad0b4245b4431524d14c9e0c9f8c8b940300abc951793aeefa6296"} Apr 24 23:54:19.619723 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.619607 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" event={"ID":"864575cd-867d-4ff1-99fd-72319ad03b97","Type":"ContainerDied","Data":"709ab6c2e08a07603587524a1f5dd71e789bae9b383f620b4add6b242809b198"} Apr 24 23:54:19.619723 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.619620 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" event={"ID":"864575cd-867d-4ff1-99fd-72319ad03b97","Type":"ContainerStarted","Data":"59780e026c58e0e40afae58c157e43aebb9f686fbc881dbfdb7b505e9dbfa5cc"} Apr 24 23:54:19.620724 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.620686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4ql4n" event={"ID":"3fb990ac-0afa-4098-9aa0-0178a341f1cc","Type":"ContainerStarted","Data":"075ee27c945246935ea0a048c048f9a028e65cd049f8842e0adb5f867e826eec"} Apr 24 23:54:19.633024 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.632974 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-zgs5z" podStartSLOduration=2.744845267 podStartE2EDuration="20.632960668s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="2026-04-24 23:54:00.609686659 +0000 UTC m=+1.770636290" lastFinishedPulling="2026-04-24 23:54:18.497802073 +0000 UTC m=+19.658751691" observedRunningTime="2026-04-24 23:54:19.632492625 +0000 UTC m=+20.793442268" watchObservedRunningTime="2026-04-24 23:54:19.632960668 +0000 UTC m=+20.793910308"
Apr 24 23:54:19.648955 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.648910 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4ql4n" podStartSLOduration=2.644929914 podStartE2EDuration="20.648895676s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="2026-04-24 23:54:00.62980351 +0000 UTC m=+1.790753136" lastFinishedPulling="2026-04-24 23:54:18.633769278 +0000 UTC m=+19.794718898" observedRunningTime="2026-04-24 23:54:19.648452013 +0000 UTC m=+20.809401650" watchObservedRunningTime="2026-04-24 23:54:19.648895676 +0000 UTC m=+20.809845319"
Apr 24 23:54:19.662010 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:19.661967 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-64.ec2.internal" podStartSLOduration=19.661952601 podStartE2EDuration="19.661952601s" podCreationTimestamp="2026-04-24 23:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:19.661946763 +0000 UTC m=+20.822896398" watchObservedRunningTime="2026-04-24 23:54:19.661952601 +0000 UTC m=+20.822902233"
Apr 24 23:54:20.395487 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:20.395457 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v"
Apr 24 23:54:20.396045 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:20.395581 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5"
Apr 24 23:54:20.625021 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:20.624986 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5ddrw" event={"ID":"a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2","Type":"ContainerStarted","Data":"e8caea52b43752f0d780d52fbe03a1304f538e5815b773ccb312cccc31660076"}
Apr 24 23:54:21.395503 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:21.395472 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k"
Apr 24 23:54:21.395961 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:21.395582 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d"
Apr 24 23:54:21.629375 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:21.629347 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log"
Apr 24 23:54:21.630797 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:21.630758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" event={"ID":"864575cd-867d-4ff1-99fd-72319ad03b97","Type":"ContainerStarted","Data":"6e8c71c6722b0ccdfcc2332c2511e1a50925778e3604bbc70cbb6b7ba34cb926"}
Apr 24 23:54:22.395549 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:22.395318 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v"
Apr 24 23:54:22.395838 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:22.395685 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5"
Apr 24 23:54:22.633764 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:22.633642 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cm667" event={"ID":"ba532e45-f2da-4349-bf2b-680421e6b958","Type":"ContainerStarted","Data":"7a7b502e43d70abe3302ebbb0a34fd4cbfdf512fff67f08fc5fa215b48974e46"}
Apr 24 23:54:22.634890 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:22.634870 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5xdns" event={"ID":"607faeff-0f25-43eb-a633-127b915c9238","Type":"ContainerStarted","Data":"46c86e16c9dad92126e0f7364b0609912deb2e0e33e508f0a0236974d781b09f"}
Apr 24 23:54:22.636150 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:22.636127 2576 generic.go:358] "Generic (PLEG): container finished" podID="18a686d43836b89472c2a3a8bbb55e45" containerID="335d485c145e4b484ba503cdb8885ecb0df7cdb195f29142783711ef807608c9" exitCode=0
Apr 24 23:54:22.636240 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:22.636186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal" event={"ID":"18a686d43836b89472c2a3a8bbb55e45","Type":"ContainerDied","Data":"335d485c145e4b484ba503cdb8885ecb0df7cdb195f29142783711ef807608c9"}
Apr 24 23:54:22.637451 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:22.637429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gfw9h" event={"ID":"84d61329-00aa-4270-b9d1-b1f736da6f64","Type":"ContainerStarted","Data":"6f42b2d1ce0ab4e67cba51f8ce2498027d0a60c06c90656944ca391f78fb82f8"}
Apr 24 23:54:22.638830 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:22.638803 2576 generic.go:358] "Generic (PLEG): container finished" podID="f788507a-76a8-4714-8f6e-bf17c2e1c40a" containerID="1abbc9770b8406a2fa0332dcf2de86938d0055b7766daf229f181706d5370ef1" exitCode=0
Apr 24 23:54:22.638931 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:22.638882 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqqcb" event={"ID":"f788507a-76a8-4714-8f6e-bf17c2e1c40a","Type":"ContainerDied","Data":"1abbc9770b8406a2fa0332dcf2de86938d0055b7766daf229f181706d5370ef1"}
Apr 24 23:54:22.640117 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:22.640013 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb" event={"ID":"046eb080-6f08-4044-85e6-6e9bf141dac3","Type":"ContainerStarted","Data":"4198257ed29333b010ee2ed1da12ddfdbb5a4e9096ae0d109446b22501996821"}
Apr 24 23:54:22.649195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:22.649162 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cm667" podStartSLOduration=5.774393403 podStartE2EDuration="23.649153548s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="2026-04-24 23:54:00.623029287 +0000 UTC m=+1.783978906" lastFinishedPulling="2026-04-24 23:54:18.497789429 +0000 UTC m=+19.658739051" observedRunningTime="2026-04-24 23:54:22.648883776 +0000 UTC m=+23.809833428" watchObservedRunningTime="2026-04-24 23:54:22.649153548 +0000 UTC m=+23.810103189"
Apr 24 23:54:22.649313 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:22.649294 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5ddrw" podStartSLOduration=5.651401607 podStartE2EDuration="23.649289361s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="2026-04-24 23:54:00.63527533 +0000 UTC m=+1.796224953" lastFinishedPulling="2026-04-24 23:54:18.633163083 +0000 UTC m=+19.794112707" observedRunningTime="2026-04-24 23:54:20.642031177 +0000 UTC m=+21.802980817" watchObservedRunningTime="2026-04-24 23:54:22.649289361 +0000 UTC m=+23.810239002"
Apr 24 23:54:22.663015 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:22.662976 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-5xdns" podStartSLOduration=5.6354778979999995 podStartE2EDuration="23.662965721s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="2026-04-24 23:54:00.604079302 +0000 UTC m=+1.765028921" lastFinishedPulling="2026-04-24 23:54:18.631567112 +0000 UTC m=+19.792516744" observedRunningTime="2026-04-24 23:54:22.662667995 +0000 UTC m=+23.823617635" watchObservedRunningTime="2026-04-24 23:54:22.662965721 +0000 UTC m=+23.823915362"
Apr 24 23:54:22.677199 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:22.677166 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gfw9h" podStartSLOduration=5.662850419 podStartE2EDuration="23.677155239s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="2026-04-24 23:54:00.617356045 +0000 UTC m=+1.778305668" lastFinishedPulling="2026-04-24 23:54:18.631660855 +0000 UTC m=+19.792610488" observedRunningTime="2026-04-24 23:54:22.67691872 +0000 UTC m=+23.837868361" watchObservedRunningTime="2026-04-24 23:54:22.677155239 +0000 UTC m=+23.838104879"
Apr 24 23:54:23.395800 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:23.395296 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k"
Apr 24 23:54:23.395800 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:23.395420 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d"
Apr 24 23:54:23.560217 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:23.559924 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-5xdns"
Apr 24 23:54:23.560858 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:23.560832 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-5xdns"
Apr 24 23:54:23.643235 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:23.643210 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal" event={"ID":"18a686d43836b89472c2a3a8bbb55e45","Type":"ContainerStarted","Data":"151bd58e536af3a2e15e861b7c5ce5d65f18d875bb4e39ae3a567326a3fc6742"}
Apr 24 23:54:23.643547 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:23.643527 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-5xdns"
Apr 24 23:54:23.644003 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:23.643987 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-5xdns"
Apr 24 23:54:23.671610 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:23.671592 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 23:54:23.677441 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:23.677401 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-64.ec2.internal" podStartSLOduration=23.677378953 podStartE2EDuration="23.677378953s" podCreationTimestamp="2026-04-24 23:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:23.65845174 +0000 UTC m=+24.819401386" watchObservedRunningTime="2026-04-24 23:54:23.677378953 +0000 UTC m=+24.838328594"
Apr 24 23:54:24.341021 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:24.340920 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T23:54:23.671607114Z","UUID":"8a19958a-c91d-42c9-b5c5-3913cbad89bf","Handler":null,"Name":"","Endpoint":""}
Apr 24 23:54:24.343138 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:24.343111 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 23:54:24.343138 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:24.343142 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 23:54:24.395482 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:24.395448 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v"
Apr 24 23:54:24.395669 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:24.395587 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5"
Apr 24 23:54:24.647231 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:24.647140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb" event={"ID":"046eb080-6f08-4044-85e6-6e9bf141dac3","Type":"ContainerStarted","Data":"dbc95e9d743a55202556b7f6d77ab56367a5b618362c220a3185a8f7cc838d8f"}
Apr 24 23:54:24.650457 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:24.650433 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log"
Apr 24 23:54:24.650890 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:24.650866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" event={"ID":"864575cd-867d-4ff1-99fd-72319ad03b97","Type":"ContainerStarted","Data":"3dcc4679e513a07de6475f5ba6e97a043dd88ad025f2e60170585c44eecbe3f2"}
Apr 24 23:54:24.651272 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:24.651249 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:24.651343 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:24.651287 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:24.651343 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:24.651323 2576 scope.go:117] "RemoveContainer" containerID="709ab6c2e08a07603587524a1f5dd71e789bae9b383f620b4add6b242809b198"
Apr 24 23:54:24.668746 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:24.668728 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:25.395731 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:25.395508 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k"
Apr 24 23:54:25.395885 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:25.395832 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d"
Apr 24 23:54:25.655509 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:25.655422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb" event={"ID":"046eb080-6f08-4044-85e6-6e9bf141dac3","Type":"ContainerStarted","Data":"cf59bf45997660d6b7b9db59be85d0a9529dd0b4ac777d977ff5301fd8e20764"}
Apr 24 23:54:25.659876 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:25.659854 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log"
Apr 24 23:54:25.660299 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:25.660266 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" event={"ID":"864575cd-867d-4ff1-99fd-72319ad03b97","Type":"ContainerStarted","Data":"d8eecd7e39611b3872dd1433ba9f0cf1f536209ff92ed543313ce7c6cd4def17"}
Apr 24 23:54:25.660792 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:25.660758 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:25.678565 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:25.678544 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls"
Apr 24 23:54:25.680078 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:25.680002 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xphwb" podStartSLOduration=2.410332174 podStartE2EDuration="26.679989635s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="2026-04-24 23:54:00.65006393 +0000 UTC m=+1.811013552" lastFinishedPulling="2026-04-24 23:54:24.919721391 +0000 UTC m=+26.080671013" observedRunningTime="2026-04-24 23:54:25.679740245 +0000 UTC m=+26.840689900" watchObservedRunningTime="2026-04-24 23:54:25.679989635 +0000 UTC m=+26.840939275"
Apr 24 23:54:25.710327 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:25.710285 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" podStartSLOduration=8.672901049 podStartE2EDuration="26.710268841s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="2026-04-24 23:54:00.643439879 +0000 UTC m=+1.804389502" lastFinishedPulling="2026-04-24 23:54:18.68080767 +0000 UTC m=+19.841757294" observedRunningTime="2026-04-24 23:54:25.709604897 +0000 UTC m=+26.870554537" watchObservedRunningTime="2026-04-24 23:54:25.710268841 +0000 UTC m=+26.871218481"
Apr 24 23:54:25.908437 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:25.908353 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wrw7v"]
Apr 24 23:54:25.908594 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:25.908489 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v"
Apr 24 23:54:25.908594 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:25.908585 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5"
Apr 24 23:54:25.923312 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:25.923280 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-p279k"]
Apr 24 23:54:25.923459 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:25.923402 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k"
Apr 24 23:54:25.923532 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:25.923495 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d"
Apr 24 23:54:27.395825 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:27.395796 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v"
Apr 24 23:54:27.396621 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:27.395835 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k"
Apr 24 23:54:27.396621 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:27.395927 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5"
Apr 24 23:54:27.396621 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:27.396030 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d"
Apr 24 23:54:27.665346 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:27.665318 2576 generic.go:358] "Generic (PLEG): container finished" podID="f788507a-76a8-4714-8f6e-bf17c2e1c40a" containerID="34881bde31ace958b7cc76926eb24ad9372213f3988e683b1af0f371da73d77d" exitCode=0
Apr 24 23:54:27.665471 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:27.665396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqqcb" event={"ID":"f788507a-76a8-4714-8f6e-bf17c2e1c40a","Type":"ContainerDied","Data":"34881bde31ace958b7cc76926eb24ad9372213f3988e683b1af0f371da73d77d"}
Apr 24 23:54:29.396518 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:29.396351 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v"
Apr 24 23:54:29.396866 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:29.396413 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k"
Apr 24 23:54:29.396866 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:29.396597 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5"
Apr 24 23:54:29.396866 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:29.396653 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d"
Apr 24 23:54:29.670721 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:29.670642 2576 generic.go:358] "Generic (PLEG): container finished" podID="f788507a-76a8-4714-8f6e-bf17c2e1c40a" containerID="5c38aa8de4dd0d8e99da74ac8d11d024511d586a3ed5f024ee1c04099cf6d461" exitCode=0
Apr 24 23:54:29.670721 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:29.670714 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqqcb" event={"ID":"f788507a-76a8-4714-8f6e-bf17c2e1c40a","Type":"ContainerDied","Data":"5c38aa8de4dd0d8e99da74ac8d11d024511d586a3ed5f024ee1c04099cf6d461"}
Apr 24 23:54:31.396032 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:31.396002 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k"
Apr 24 23:54:31.396404 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:31.396007 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v"
Apr 24 23:54:31.396404 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:31.396097 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p279k" podUID="0badefdd-5292-410f-94d9-30bdbec0d66d"
Apr 24 23:54:31.396404 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:31.396212 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wrw7v" podUID="a4df8649-8216-4ed9-b023-a6de8b027cd5"
Apr 24 23:54:31.676581 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:31.676549 2576 generic.go:358] "Generic (PLEG): container finished" podID="f788507a-76a8-4714-8f6e-bf17c2e1c40a" containerID="7f65efb23a7ce67d36b653c83d3b6e6e841f5d2224aafa9c6a170eb348e06861" exitCode=0
Apr 24 23:54:31.676737 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:31.676608 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqqcb" event={"ID":"f788507a-76a8-4714-8f6e-bf17c2e1c40a","Type":"ContainerDied","Data":"7f65efb23a7ce67d36b653c83d3b6e6e841f5d2224aafa9c6a170eb348e06861"}
Apr 24 23:54:32.126714 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.126674 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-64.ec2.internal" event="NodeReady"
Apr 24 23:54:32.126882 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.126840 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 23:54:32.142139 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.142110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs\") pod \"network-metrics-daemon-wrw7v\" (UID: \"a4df8649-8216-4ed9-b023-a6de8b027cd5\") " pod="openshift-multus/network-metrics-daemon-wrw7v"
Apr 24 23:54:32.142284 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:32.142267 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:32.142346 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:32.142326 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs podName:a4df8649-8216-4ed9-b023-a6de8b027cd5 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:04.14230796 +0000 UTC m=+65.303257583 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs") pod "network-metrics-daemon-wrw7v" (UID: "a4df8649-8216-4ed9-b023-a6de8b027cd5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:32.178067 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.178039 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-j4hmb"]
Apr 24 23:54:32.197149 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.197125 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gb2jv"]
Apr 24 23:54:32.197326 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.197299 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j4hmb"
Apr 24 23:54:32.199612 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.199588 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 23:54:32.199743 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.199641 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 23:54:32.199743 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.199656 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ktrpk\""
Apr 24 23:54:32.221556 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.221532 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j4hmb"]
Apr 24 23:54:32.221667 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.221568 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gb2jv"]
Apr 24 23:54:32.221667 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.221566 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gb2jv"
Apr 24 23:54:32.224032 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.224009 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 23:54:32.224032 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.224026 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 23:54:32.224221 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.224028 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 23:54:32.224221 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.224152 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-h7snm\""
Apr 24 23:54:32.243483 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.243459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmpxd\" (UniqueName: \"kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd\") pod \"network-check-target-p279k\" (UID: \"0badefdd-5292-410f-94d9-30bdbec0d66d\") " pod="openshift-network-diagnostics/network-check-target-p279k"
Apr 24 23:54:32.243669 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:32.243653 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:54:32.243760 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:32.243675 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:54:32.243760 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:32.243688 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xmpxd for pod openshift-network-diagnostics/network-check-target-p279k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:32.243858 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:32.243764 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd podName:0badefdd-5292-410f-94d9-30bdbec0d66d nodeName:}" failed. No retries permitted until 2026-04-24 23:55:04.243746718 +0000 UTC m=+65.404696354 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-xmpxd" (UniqueName: "kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd") pod "network-check-target-p279k" (UID: "0badefdd-5292-410f-94d9-30bdbec0d66d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:32.344768 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.344737 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb"
Apr 24 23:54:32.344928 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.344774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c18a83d5-7d20-4b99-9a28-d4fea36360b1-tmp-dir\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb"
Apr 24 23:54:32.344928 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.344833 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbzzh\" (UniqueName: \"kubernetes.io/projected/c18a83d5-7d20-4b99-9a28-d4fea36360b1-kube-api-access-sbzzh\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb"
Apr 24 23:54:32.344928 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.344858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert\") pod \"ingress-canary-gb2jv\" (UID: \"14193e4c-7287-4686-892b-3006e6c02a97\") " pod="openshift-ingress-canary/ingress-canary-gb2jv"
Apr 24 23:54:32.344928 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.344912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qpmv\" (UniqueName: \"kubernetes.io/projected/14193e4c-7287-4686-892b-3006e6c02a97-kube-api-access-2qpmv\") pod \"ingress-canary-gb2jv\" (UID: \"14193e4c-7287-4686-892b-3006e6c02a97\") " pod="openshift-ingress-canary/ingress-canary-gb2jv"
Apr 24 23:54:32.345132 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.344971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c18a83d5-7d20-4b99-9a28-d4fea36360b1-config-volume\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb"
Apr 24 23:54:32.446179 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.446101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c18a83d5-7d20-4b99-9a28-d4fea36360b1-config-volume\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb"
Apr 24 23:54:32.446179 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.446168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb"
Apr 24 23:54:32.446785 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.446198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c18a83d5-7d20-4b99-9a28-d4fea36360b1-tmp-dir\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb"
Apr 24 23:54:32.446785 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.446242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbzzh\" (UniqueName: \"kubernetes.io/projected/c18a83d5-7d20-4b99-9a28-d4fea36360b1-kube-api-access-sbzzh\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb"
Apr 24 23:54:32.446785 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:32.446259 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:32.446785 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.446271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert\") pod \"ingress-canary-gb2jv\" (UID: \"14193e4c-7287-4686-892b-3006e6c02a97\") " pod="openshift-ingress-canary/ingress-canary-gb2jv"
Apr 24 23:54:32.446785 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.446296 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qpmv\" (UniqueName: \"kubernetes.io/projected/14193e4c-7287-4686-892b-3006e6c02a97-kube-api-access-2qpmv\") pod
\"ingress-canary-gb2jv\" (UID: \"14193e4c-7287-4686-892b-3006e6c02a97\") " pod="openshift-ingress-canary/ingress-canary-gb2jv" Apr 24 23:54:32.446785 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:32.446326 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls podName:c18a83d5-7d20-4b99-9a28-d4fea36360b1 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:32.946306071 +0000 UTC m=+34.107255706 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls") pod "dns-default-j4hmb" (UID: "c18a83d5-7d20-4b99-9a28-d4fea36360b1") : secret "dns-default-metrics-tls" not found Apr 24 23:54:32.446785 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:32.446447 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:32.446785 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:32.446482 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert podName:14193e4c-7287-4686-892b-3006e6c02a97 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:32.946471918 +0000 UTC m=+34.107421541 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert") pod "ingress-canary-gb2jv" (UID: "14193e4c-7287-4686-892b-3006e6c02a97") : secret "canary-serving-cert" not found Apr 24 23:54:32.446785 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.446573 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c18a83d5-7d20-4b99-9a28-d4fea36360b1-tmp-dir\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb" Apr 24 23:54:32.447080 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.446793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c18a83d5-7d20-4b99-9a28-d4fea36360b1-config-volume\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb" Apr 24 23:54:32.458784 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.458763 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbzzh\" (UniqueName: \"kubernetes.io/projected/c18a83d5-7d20-4b99-9a28-d4fea36360b1-kube-api-access-sbzzh\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb" Apr 24 23:54:32.459033 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.459012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qpmv\" (UniqueName: \"kubernetes.io/projected/14193e4c-7287-4686-892b-3006e6c02a97-kube-api-access-2qpmv\") pod \"ingress-canary-gb2jv\" (UID: \"14193e4c-7287-4686-892b-3006e6c02a97\") " pod="openshift-ingress-canary/ingress-canary-gb2jv" Apr 24 23:54:32.950798 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.950737 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert\") pod \"ingress-canary-gb2jv\" (UID: \"14193e4c-7287-4686-892b-3006e6c02a97\") " pod="openshift-ingress-canary/ingress-canary-gb2jv" Apr 24 23:54:32.951049 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:32.950863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb" Apr 24 23:54:32.951049 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:32.950893 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:32.951049 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:32.950978 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert podName:14193e4c-7287-4686-892b-3006e6c02a97 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:33.950955921 +0000 UTC m=+35.111905544 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert") pod "ingress-canary-gb2jv" (UID: "14193e4c-7287-4686-892b-3006e6c02a97") : secret "canary-serving-cert" not found Apr 24 23:54:32.951049 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:32.951030 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:32.951262 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:32.951098 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls podName:c18a83d5-7d20-4b99-9a28-d4fea36360b1 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:33.95108076 +0000 UTC m=+35.112030381 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls") pod "dns-default-j4hmb" (UID: "c18a83d5-7d20-4b99-9a28-d4fea36360b1") : secret "dns-default-metrics-tls" not found Apr 24 23:54:33.395531 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:33.395496 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:54:33.395531 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:33.395512 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:54:33.399203 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:33.399180 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 23:54:33.399343 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:33.399180 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wfsx7\"" Apr 24 23:54:33.399949 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:33.399928 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 23:54:33.400082 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:33.399956 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jk9j5\"" Apr 24 23:54:33.400082 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:33.399934 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 23:54:33.958471 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:33.958415 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb" Apr 24 23:54:33.959304 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:33.958508 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:33.959304 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:33.958513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert\") pod \"ingress-canary-gb2jv\" (UID: \"14193e4c-7287-4686-892b-3006e6c02a97\") " pod="openshift-ingress-canary/ingress-canary-gb2jv" Apr 24 23:54:33.959304 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:33.958585 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls podName:c18a83d5-7d20-4b99-9a28-d4fea36360b1 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:35.958563498 +0000 UTC m=+37.119513131 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls") pod "dns-default-j4hmb" (UID: "c18a83d5-7d20-4b99-9a28-d4fea36360b1") : secret "dns-default-metrics-tls" not found Apr 24 23:54:33.959304 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:33.958591 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:33.959304 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:33.958644 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert podName:14193e4c-7287-4686-892b-3006e6c02a97 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:35.958630061 +0000 UTC m=+37.119579693 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert") pod "ingress-canary-gb2jv" (UID: "14193e4c-7287-4686-892b-3006e6c02a97") : secret "canary-serving-cert" not found Apr 24 23:54:35.972853 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:35.972816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb" Apr 24 23:54:35.973333 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:35.972875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert\") pod \"ingress-canary-gb2jv\" (UID: \"14193e4c-7287-4686-892b-3006e6c02a97\") " pod="openshift-ingress-canary/ingress-canary-gb2jv" Apr 24 23:54:35.973333 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:35.972967 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:35.973333 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:35.973012 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert podName:14193e4c-7287-4686-892b-3006e6c02a97 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:39.972998243 +0000 UTC m=+41.133947862 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert") pod "ingress-canary-gb2jv" (UID: "14193e4c-7287-4686-892b-3006e6c02a97") : secret "canary-serving-cert" not found Apr 24 23:54:35.973474 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:35.973407 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:35.973474 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:35.973464 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls podName:c18a83d5-7d20-4b99-9a28-d4fea36360b1 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:39.973449723 +0000 UTC m=+41.134399342 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls") pod "dns-default-j4hmb" (UID: "c18a83d5-7d20-4b99-9a28-d4fea36360b1") : secret "dns-default-metrics-tls" not found Apr 24 23:54:37.692584 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:37.692493 2576 generic.go:358] "Generic (PLEG): container finished" podID="f788507a-76a8-4714-8f6e-bf17c2e1c40a" containerID="2fdd8d2c1ac73d4280d7dc53b9247c8dccc4caf81ad534085affa47485eec85f" exitCode=0 Apr 24 23:54:37.692584 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:37.692544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqqcb" event={"ID":"f788507a-76a8-4714-8f6e-bf17c2e1c40a","Type":"ContainerDied","Data":"2fdd8d2c1ac73d4280d7dc53b9247c8dccc4caf81ad534085affa47485eec85f"} Apr 24 23:54:38.697120 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:38.697086 2576 generic.go:358] "Generic (PLEG): container finished" podID="f788507a-76a8-4714-8f6e-bf17c2e1c40a" containerID="7df10a909960e1e058ede2cd6a7cae534f56c3da0600d8709dc7a498ca2209de" exitCode=0 Apr 24 
23:54:38.697572 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:38.697151 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqqcb" event={"ID":"f788507a-76a8-4714-8f6e-bf17c2e1c40a","Type":"ContainerDied","Data":"7df10a909960e1e058ede2cd6a7cae534f56c3da0600d8709dc7a498ca2209de"} Apr 24 23:54:39.701667 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:39.701634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqqcb" event={"ID":"f788507a-76a8-4714-8f6e-bf17c2e1c40a","Type":"ContainerStarted","Data":"b0648c7f9e41bf860375766635b28def83a2365747337dec5cf12ad41eb1c384"} Apr 24 23:54:39.729266 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:39.729197 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kqqcb" podStartSLOduration=4.316305774 podStartE2EDuration="40.729175889s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="2026-04-24 23:54:00.655402247 +0000 UTC m=+1.816351878" lastFinishedPulling="2026-04-24 23:54:37.068272374 +0000 UTC m=+38.229221993" observedRunningTime="2026-04-24 23:54:39.725462988 +0000 UTC m=+40.886412630" watchObservedRunningTime="2026-04-24 23:54:39.729175889 +0000 UTC m=+40.890125530" Apr 24 23:54:40.000215 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:40.000170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb" Apr 24 23:54:40.000381 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:40.000235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert\") pod \"ingress-canary-gb2jv\" 
(UID: \"14193e4c-7287-4686-892b-3006e6c02a97\") " pod="openshift-ingress-canary/ingress-canary-gb2jv" Apr 24 23:54:40.000381 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:40.000315 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:40.000381 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:40.000320 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:40.000381 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:40.000365 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert podName:14193e4c-7287-4686-892b-3006e6c02a97 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:48.000351207 +0000 UTC m=+49.161300847 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert") pod "ingress-canary-gb2jv" (UID: "14193e4c-7287-4686-892b-3006e6c02a97") : secret "canary-serving-cert" not found Apr 24 23:54:40.000526 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:40.000386 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls podName:c18a83d5-7d20-4b99-9a28-d4fea36360b1 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:48.000370326 +0000 UTC m=+49.161319965 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls") pod "dns-default-j4hmb" (UID: "c18a83d5-7d20-4b99-9a28-d4fea36360b1") : secret "dns-default-metrics-tls" not found Apr 24 23:54:48.052571 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:48.052529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert\") pod \"ingress-canary-gb2jv\" (UID: \"14193e4c-7287-4686-892b-3006e6c02a97\") " pod="openshift-ingress-canary/ingress-canary-gb2jv" Apr 24 23:54:48.053008 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:48.052592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb" Apr 24 23:54:48.053008 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:48.052668 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:48.053008 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:48.052666 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:48.053008 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:48.052742 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls podName:c18a83d5-7d20-4b99-9a28-d4fea36360b1 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:04.052727486 +0000 UTC m=+65.213677118 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls") pod "dns-default-j4hmb" (UID: "c18a83d5-7d20-4b99-9a28-d4fea36360b1") : secret "dns-default-metrics-tls" not found Apr 24 23:54:48.053008 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:54:48.052756 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert podName:14193e4c-7287-4686-892b-3006e6c02a97 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:04.05275064 +0000 UTC m=+65.213700262 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert") pod "ingress-canary-gb2jv" (UID: "14193e4c-7287-4686-892b-3006e6c02a97") : secret "canary-serving-cert" not found Apr 24 23:54:57.680542 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:54:57.680514 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mj7ls" Apr 24 23:55:04.058613 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:04.058571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb" Apr 24 23:55:04.059135 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:04.058632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert\") pod \"ingress-canary-gb2jv\" (UID: \"14193e4c-7287-4686-892b-3006e6c02a97\") " pod="openshift-ingress-canary/ingress-canary-gb2jv" Apr 24 23:55:04.059135 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:04.058740 2576 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:55:04.059135 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:04.058770 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:55:04.059135 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:04.058806 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls podName:c18a83d5-7d20-4b99-9a28-d4fea36360b1 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:36.058790191 +0000 UTC m=+97.219739811 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls") pod "dns-default-j4hmb" (UID: "c18a83d5-7d20-4b99-9a28-d4fea36360b1") : secret "dns-default-metrics-tls" not found Apr 24 23:55:04.059135 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:04.058835 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert podName:14193e4c-7287-4686-892b-3006e6c02a97 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:36.058818469 +0000 UTC m=+97.219768097 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert") pod "ingress-canary-gb2jv" (UID: "14193e4c-7287-4686-892b-3006e6c02a97") : secret "canary-serving-cert" not found Apr 24 23:55:04.159150 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:04.159117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs\") pod \"network-metrics-daemon-wrw7v\" (UID: \"a4df8649-8216-4ed9-b023-a6de8b027cd5\") " pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:55:04.161796 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:04.161776 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 23:55:04.169705 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:04.169677 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 23:55:04.169787 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:04.169774 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs podName:a4df8649-8216-4ed9-b023-a6de8b027cd5 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:08.169753671 +0000 UTC m=+129.330703300 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs") pod "network-metrics-daemon-wrw7v" (UID: "a4df8649-8216-4ed9-b023-a6de8b027cd5") : secret "metrics-daemon-secret" not found Apr 24 23:55:04.260020 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:04.259999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmpxd\" (UniqueName: \"kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd\") pod \"network-check-target-p279k\" (UID: \"0badefdd-5292-410f-94d9-30bdbec0d66d\") " pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:55:04.262948 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:04.262924 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 23:55:04.273002 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:04.272986 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 23:55:04.284235 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:04.284215 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmpxd\" (UniqueName: \"kubernetes.io/projected/0badefdd-5292-410f-94d9-30bdbec0d66d-kube-api-access-xmpxd\") pod \"network-check-target-p279k\" (UID: \"0badefdd-5292-410f-94d9-30bdbec0d66d\") " pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:55:04.311055 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:04.310999 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wfsx7\"" Apr 24 23:55:04.319811 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:04.319793 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:55:04.445543 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:04.445517 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-p279k"] Apr 24 23:55:04.448768 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:55:04.448738 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0badefdd_5292_410f_94d9_30bdbec0d66d.slice/crio-7c82eec18118b940ceae4370611caaee684516f973d3fde7364107c5f193c26b WatchSource:0}: Error finding container 7c82eec18118b940ceae4370611caaee684516f973d3fde7364107c5f193c26b: Status 404 returned error can't find the container with id 7c82eec18118b940ceae4370611caaee684516f973d3fde7364107c5f193c26b Apr 24 23:55:04.749670 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:04.749638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-p279k" event={"ID":"0badefdd-5292-410f-94d9-30bdbec0d66d","Type":"ContainerStarted","Data":"7c82eec18118b940ceae4370611caaee684516f973d3fde7364107c5f193c26b"} Apr 24 23:55:07.756885 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:07.756852 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-p279k" event={"ID":"0badefdd-5292-410f-94d9-30bdbec0d66d","Type":"ContainerStarted","Data":"15c15a797944a61acaa696561f3fe0a1dbd6e4b58bf97e7487a3090385dd7db2"} Apr 24 23:55:07.757216 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:07.757078 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:55:07.776404 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:07.776356 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-p279k" 
podStartSLOduration=66.202266173 podStartE2EDuration="1m8.776344794s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="2026-04-24 23:55:04.450459448 +0000 UTC m=+65.611409067" lastFinishedPulling="2026-04-24 23:55:07.024538067 +0000 UTC m=+68.185487688" observedRunningTime="2026-04-24 23:55:07.775570158 +0000 UTC m=+68.936519798" watchObservedRunningTime="2026-04-24 23:55:07.776344794 +0000 UTC m=+68.937294436" Apr 24 23:55:36.072152 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:36.072086 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb" Apr 24 23:55:36.072152 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:36.072143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert\") pod \"ingress-canary-gb2jv\" (UID: \"14193e4c-7287-4686-892b-3006e6c02a97\") " pod="openshift-ingress-canary/ingress-canary-gb2jv" Apr 24 23:55:36.072631 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:36.072223 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:55:36.072631 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:36.072226 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:55:36.072631 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:36.072292 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert podName:14193e4c-7287-4686-892b-3006e6c02a97 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:40.07227878 +0000 UTC m=+161.233228399 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert") pod "ingress-canary-gb2jv" (UID: "14193e4c-7287-4686-892b-3006e6c02a97") : secret "canary-serving-cert" not found Apr 24 23:55:36.072631 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:36.072305 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls podName:c18a83d5-7d20-4b99-9a28-d4fea36360b1 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:40.072298996 +0000 UTC m=+161.233248615 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls") pod "dns-default-j4hmb" (UID: "c18a83d5-7d20-4b99-9a28-d4fea36360b1") : secret "dns-default-metrics-tls" not found Apr 24 23:55:38.761834 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:38.761806 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-p279k" Apr 24 23:55:45.580001 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.579962 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4"] Apr 24 23:55:45.586242 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.586215 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bcgjj"] Apr 24 23:55:45.586382 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.586364 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4" Apr 24 23:55:45.588493 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.588469 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-62gwr\"" Apr 24 23:55:45.589542 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.589517 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:55:45.589675 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.589656 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:55:45.589748 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.589725 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 23:55:45.589822 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.589662 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 23:55:45.590944 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.590925 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4"] Apr 24 23:55:45.591684 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.591665 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 24 23:55:45.592028 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.592006 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 24 23:55:45.592160 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.592050 2576 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 24 23:55:45.592266 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.592064 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:55:45.592571 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.592555 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-jgmbr\"" Apr 24 23:55:45.597394 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.597376 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 24 23:55:45.603262 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.603245 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bcgjj"] Apr 24 23:55:45.681437 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.681406 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-99fjd"] Apr 24 23:55:45.684881 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.684860 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2"] Apr 24 23:55:45.685020 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.685003 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-99fjd" Apr 24 23:55:45.687616 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.687599 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:55:45.687821 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.687808 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-z8sn2\"" Apr 24 23:55:45.687894 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.687875 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 24 23:55:45.688431 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.688416 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8"] Apr 24 23:55:45.688557 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.688544 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" Apr 24 23:55:45.690934 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.690914 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-6jjln\"" Apr 24 23:55:45.691053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.690950 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 23:55:45.691053 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.690993 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 23:55:45.691219 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.691204 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:55:45.691409 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.691390 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 23:55:45.691647 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.691424 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb"] Apr 24 23:55:45.691647 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.691572 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" Apr 24 23:55:45.693528 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.693513 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 23:55:45.693832 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.693816 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 23:55:45.693832 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.693826 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:55:45.693970 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.693862 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-rcwlh\"" Apr 24 23:55:45.693970 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.693816 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 23:55:45.695307 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.695293 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb" Apr 24 23:55:45.697429 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.697412 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 23:55:45.698019 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.698001 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 23:55:45.699728 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.699385 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-99fjd"] Apr 24 23:55:45.700143 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.700115 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-zxhwf\"" Apr 24 23:55:45.700388 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.700369 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb"] Apr 24 23:55:45.701205 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.701186 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8"] Apr 24 23:55:45.713840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.713821 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2"] Apr 24 23:55:45.729760 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.729678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54wq2\" (UniqueName: \"kubernetes.io/projected/1d316ecc-7ca1-4dcd-a561-d363f811198c-kube-api-access-54wq2\") pod 
\"cluster-samples-operator-6dc5bdb6b4-7rhm4\" (UID: \"1d316ecc-7ca1-4dcd-a561-d363f811198c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4" Apr 24 23:55:45.729849 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.729790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ccca75f-9d61-4cbb-bc55-f033f88df8c6-trusted-ca\") pod \"console-operator-9d4b6777b-bcgjj\" (UID: \"8ccca75f-9d61-4cbb-bc55-f033f88df8c6\") " pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:55:45.729849 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.729821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgk5x\" (UniqueName: \"kubernetes.io/projected/8ccca75f-9d61-4cbb-bc55-f033f88df8c6-kube-api-access-qgk5x\") pod \"console-operator-9d4b6777b-bcgjj\" (UID: \"8ccca75f-9d61-4cbb-bc55-f033f88df8c6\") " pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:55:45.729953 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.729860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ccca75f-9d61-4cbb-bc55-f033f88df8c6-serving-cert\") pod \"console-operator-9d4b6777b-bcgjj\" (UID: \"8ccca75f-9d61-4cbb-bc55-f033f88df8c6\") " pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:55:45.729953 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.729889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ccca75f-9d61-4cbb-bc55-f033f88df8c6-config\") pod \"console-operator-9d4b6777b-bcgjj\" (UID: \"8ccca75f-9d61-4cbb-bc55-f033f88df8c6\") " pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:55:45.730184 
ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.730161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7rhm4\" (UID: \"1d316ecc-7ca1-4dcd-a561-d363f811198c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4" Apr 24 23:55:45.781924 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.781903 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9l8ks"] Apr 24 23:55:45.784977 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.784962 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9l8ks" Apr 24 23:55:45.786201 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.786183 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-sfnrq"] Apr 24 23:55:45.787945 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.787917 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-s2d95\"" Apr 24 23:55:45.790520 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.790499 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-65f5954f47-n9ftv"] Apr 24 23:55:45.790676 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.790657 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:45.793321 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.793304 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-jvmmr\"" Apr 24 23:55:45.793557 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.793544 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 23:55:45.793780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.793763 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:45.793780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.793773 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 23:55:45.794594 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.794577 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 23:55:45.795239 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.794955 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 23:55:45.796897 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.796878 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9l8ks"] Apr 24 23:55:45.799204 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.799182 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 23:55:45.800262 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.800237 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-f8xnt\"" Apr 24 23:55:45.800353 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.800301 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 23:55:45.800520 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.800504 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 23:55:45.800732 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.800717 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 23:55:45.805951 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.805930 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 23:55:45.807564 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.807544 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-65f5954f47-n9ftv"] Apr 24 23:55:45.809254 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.809222 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-sfnrq"] Apr 24 23:55:45.830470 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.830422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7nlh\" (UniqueName: \"kubernetes.io/projected/85bb6fee-0df0-467b-85aa-d617cbda12e0-kube-api-access-n7nlh\") pod \"volume-data-source-validator-7c6cbb6c87-99fjd\" (UID: \"85bb6fee-0df0-467b-85aa-d617cbda12e0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-99fjd" Apr 24 23:55:45.830470 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.830451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ccca75f-9d61-4cbb-bc55-f033f88df8c6-config\") pod \"console-operator-9d4b6777b-bcgjj\" (UID: \"8ccca75f-9d61-4cbb-bc55-f033f88df8c6\") " pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:55:45.830580 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.830478 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe581fd0-91fe-46d8-be3f-cc2be31f574f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-sb9nb\" (UID: \"fe581fd0-91fe-46d8-be3f-cc2be31f574f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb" Apr 24 23:55:45.830580 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.830517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7rhm4\" (UID: \"1d316ecc-7ca1-4dcd-a561-d363f811198c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4" Apr 24 23:55:45.830580 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.830560 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34e5790d-4147-4de2-8280-8a4d156daee6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gtxb2\" (UID: \"34e5790d-4147-4de2-8280-8a4d156daee6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" Apr 24 23:55:45.830670 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.830583 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69j2r\" (UniqueName: 
\"kubernetes.io/projected/bc0a1f9d-aade-4d80-a5b8-fbc8542431a7-kube-api-access-69j2r\") pod \"service-ca-operator-d6fc45fc5-zpjq8\" (UID: \"bc0a1f9d-aade-4d80-a5b8-fbc8542431a7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" Apr 24 23:55:45.830670 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.830618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54wq2\" (UniqueName: \"kubernetes.io/projected/1d316ecc-7ca1-4dcd-a561-d363f811198c-kube-api-access-54wq2\") pod \"cluster-samples-operator-6dc5bdb6b4-7rhm4\" (UID: \"1d316ecc-7ca1-4dcd-a561-d363f811198c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4" Apr 24 23:55:45.830670 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:45.830634 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 23:55:45.830670 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.830660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ccca75f-9d61-4cbb-bc55-f033f88df8c6-trusted-ca\") pod \"console-operator-9d4b6777b-bcgjj\" (UID: \"8ccca75f-9d61-4cbb-bc55-f033f88df8c6\") " pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:55:45.830864 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:45.830714 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls podName:1d316ecc-7ca1-4dcd-a561-d363f811198c nodeName:}" failed. No retries permitted until 2026-04-24 23:55:46.330678877 +0000 UTC m=+107.491628515 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7rhm4" (UID: "1d316ecc-7ca1-4dcd-a561-d363f811198c") : secret "samples-operator-tls" not found Apr 24 23:55:45.830864 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.830823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc0a1f9d-aade-4d80-a5b8-fbc8542431a7-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zpjq8\" (UID: \"bc0a1f9d-aade-4d80-a5b8-fbc8542431a7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" Apr 24 23:55:45.830974 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.830864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgk5x\" (UniqueName: \"kubernetes.io/projected/8ccca75f-9d61-4cbb-bc55-f033f88df8c6-kube-api-access-qgk5x\") pod \"console-operator-9d4b6777b-bcgjj\" (UID: \"8ccca75f-9d61-4cbb-bc55-f033f88df8c6\") " pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:55:45.830974 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.830897 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e5790d-4147-4de2-8280-8a4d156daee6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gtxb2\" (UID: \"34e5790d-4147-4de2-8280-8a4d156daee6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" Apr 24 23:55:45.830974 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.830927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ccca75f-9d61-4cbb-bc55-f033f88df8c6-serving-cert\") pod 
\"console-operator-9d4b6777b-bcgjj\" (UID: \"8ccca75f-9d61-4cbb-bc55-f033f88df8c6\") " pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:55:45.830974 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.830956 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml24h\" (UniqueName: \"kubernetes.io/projected/34e5790d-4147-4de2-8280-8a4d156daee6-kube-api-access-ml24h\") pod \"kube-storage-version-migrator-operator-6769c5d45-gtxb2\" (UID: \"34e5790d-4147-4de2-8280-8a4d156daee6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" Apr 24 23:55:45.831149 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.831001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc0a1f9d-aade-4d80-a5b8-fbc8542431a7-config\") pod \"service-ca-operator-d6fc45fc5-zpjq8\" (UID: \"bc0a1f9d-aade-4d80-a5b8-fbc8542431a7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" Apr 24 23:55:45.831149 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.831039 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sb9nb\" (UID: \"fe581fd0-91fe-46d8-be3f-cc2be31f574f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb" Apr 24 23:55:45.831149 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.831113 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ccca75f-9d61-4cbb-bc55-f033f88df8c6-config\") pod \"console-operator-9d4b6777b-bcgjj\" (UID: \"8ccca75f-9d61-4cbb-bc55-f033f88df8c6\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:55:45.831360 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.831343 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ccca75f-9d61-4cbb-bc55-f033f88df8c6-trusted-ca\") pod \"console-operator-9d4b6777b-bcgjj\" (UID: \"8ccca75f-9d61-4cbb-bc55-f033f88df8c6\") " pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:55:45.833214 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.833187 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ccca75f-9d61-4cbb-bc55-f033f88df8c6-serving-cert\") pod \"console-operator-9d4b6777b-bcgjj\" (UID: \"8ccca75f-9d61-4cbb-bc55-f033f88df8c6\") " pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:55:45.841878 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.841858 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54wq2\" (UniqueName: \"kubernetes.io/projected/1d316ecc-7ca1-4dcd-a561-d363f811198c-kube-api-access-54wq2\") pod \"cluster-samples-operator-6dc5bdb6b4-7rhm4\" (UID: \"1d316ecc-7ca1-4dcd-a561-d363f811198c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4" Apr 24 23:55:45.842097 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.842081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgk5x\" (UniqueName: \"kubernetes.io/projected/8ccca75f-9d61-4cbb-bc55-f033f88df8c6-kube-api-access-qgk5x\") pod \"console-operator-9d4b6777b-bcgjj\" (UID: \"8ccca75f-9d61-4cbb-bc55-f033f88df8c6\") " pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:55:45.903041 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.903015 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:55:45.931949 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.931917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fx2g\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-kube-api-access-9fx2g\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:45.932064 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.931970 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5c5fbc58-8768-4fe2-80b6-18689310ec18-image-registry-private-configuration\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:45.932064 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.931996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc0a1f9d-aade-4d80-a5b8-fbc8542431a7-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zpjq8\" (UID: \"bc0a1f9d-aade-4d80-a5b8-fbc8542431a7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" Apr 24 23:55:45.932166 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932056 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-bound-sa-token\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:45.932166 ip-10-0-132-64 kubenswrapper[2576]: I0424 
23:55:45.932103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e5790d-4147-4de2-8280-8a4d156daee6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gtxb2\" (UID: \"34e5790d-4147-4de2-8280-8a4d156daee6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" Apr 24 23:55:45.932166 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932145 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sb9nb\" (UID: \"fe581fd0-91fe-46d8-be3f-cc2be31f574f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb" Apr 24 23:55:45.932313 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5c5fbc58-8768-4fe2-80b6-18689310ec18-ca-trust-extracted\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:45.932313 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nlh\" (UniqueName: \"kubernetes.io/projected/85bb6fee-0df0-467b-85aa-d617cbda12e0-kube-api-access-n7nlh\") pod \"volume-data-source-validator-7c6cbb6c87-99fjd\" (UID: \"85bb6fee-0df0-467b-85aa-d617cbda12e0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-99fjd" Apr 24 23:55:45.932313 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-l6s4h\" (UniqueName: \"kubernetes.io/projected/657f2810-9fef-43b7-825e-f2573c428db1-kube-api-access-l6s4h\") pod \"network-check-source-8894fc9bd-9l8ks\" (UID: \"657f2810-9fef-43b7-825e-f2573c428db1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9l8ks" Apr 24 23:55:45.932313 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:45.932275 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:55:45.932514 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d5c69f72-f063-42af-a243-45f740a1ea73-snapshots\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:45.932514 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:45.932350 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert podName:fe581fd0-91fe-46d8-be3f-cc2be31f574f nodeName:}" failed. No retries permitted until 2026-04-24 23:55:46.432329486 +0000 UTC m=+107.593279109 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sb9nb" (UID: "fe581fd0-91fe-46d8-be3f-cc2be31f574f") : secret "networking-console-plugin-cert" not found Apr 24 23:55:45.932514 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5c69f72-f063-42af-a243-45f740a1ea73-service-ca-bundle\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:45.932514 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932412 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5c69f72-f063-42af-a243-45f740a1ea73-serving-cert\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:45.932514 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-certificates\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:45.932514 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932480 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe581fd0-91fe-46d8-be3f-cc2be31f574f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-sb9nb\" 
(UID: \"fe581fd0-91fe-46d8-be3f-cc2be31f574f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb" Apr 24 23:55:45.932514 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932506 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:45.932864 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5c5fbc58-8768-4fe2-80b6-18689310ec18-installation-pull-secrets\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:45.932864 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69j2r\" (UniqueName: \"kubernetes.io/projected/bc0a1f9d-aade-4d80-a5b8-fbc8542431a7-kube-api-access-69j2r\") pod \"service-ca-operator-d6fc45fc5-zpjq8\" (UID: \"bc0a1f9d-aade-4d80-a5b8-fbc8542431a7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" Apr 24 23:55:45.932864 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932636 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d5c69f72-f063-42af-a243-45f740a1ea73-tmp\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:45.932864 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932636 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e5790d-4147-4de2-8280-8a4d156daee6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gtxb2\" (UID: \"34e5790d-4147-4de2-8280-8a4d156daee6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" Apr 24 23:55:45.932864 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34e5790d-4147-4de2-8280-8a4d156daee6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gtxb2\" (UID: \"34e5790d-4147-4de2-8280-8a4d156daee6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" Apr 24 23:55:45.932864 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fd9k\" (UniqueName: \"kubernetes.io/projected/d5c69f72-f063-42af-a243-45f740a1ea73-kube-api-access-4fd9k\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:45.932864 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5c69f72-f063-42af-a243-45f740a1ea73-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:45.932864 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml24h\" (UniqueName: 
\"kubernetes.io/projected/34e5790d-4147-4de2-8280-8a4d156daee6-kube-api-access-ml24h\") pod \"kube-storage-version-migrator-operator-6769c5d45-gtxb2\" (UID: \"34e5790d-4147-4de2-8280-8a4d156daee6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" Apr 24 23:55:45.932864 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc0a1f9d-aade-4d80-a5b8-fbc8542431a7-config\") pod \"service-ca-operator-d6fc45fc5-zpjq8\" (UID: \"bc0a1f9d-aade-4d80-a5b8-fbc8542431a7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" Apr 24 23:55:45.933299 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.932890 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c5fbc58-8768-4fe2-80b6-18689310ec18-trusted-ca\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:45.933299 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.933242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe581fd0-91fe-46d8-be3f-cc2be31f574f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-sb9nb\" (UID: \"fe581fd0-91fe-46d8-be3f-cc2be31f574f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb" Apr 24 23:55:45.933746 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.933723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc0a1f9d-aade-4d80-a5b8-fbc8542431a7-config\") pod \"service-ca-operator-d6fc45fc5-zpjq8\" (UID: \"bc0a1f9d-aade-4d80-a5b8-fbc8542431a7\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" Apr 24 23:55:45.934937 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.934917 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc0a1f9d-aade-4d80-a5b8-fbc8542431a7-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zpjq8\" (UID: \"bc0a1f9d-aade-4d80-a5b8-fbc8542431a7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" Apr 24 23:55:45.935026 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.935000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34e5790d-4147-4de2-8280-8a4d156daee6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gtxb2\" (UID: \"34e5790d-4147-4de2-8280-8a4d156daee6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" Apr 24 23:55:45.946100 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.946041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml24h\" (UniqueName: \"kubernetes.io/projected/34e5790d-4147-4de2-8280-8a4d156daee6-kube-api-access-ml24h\") pod \"kube-storage-version-migrator-operator-6769c5d45-gtxb2\" (UID: \"34e5790d-4147-4de2-8280-8a4d156daee6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" Apr 24 23:55:45.946813 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.946742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69j2r\" (UniqueName: \"kubernetes.io/projected/bc0a1f9d-aade-4d80-a5b8-fbc8542431a7-kube-api-access-69j2r\") pod \"service-ca-operator-d6fc45fc5-zpjq8\" (UID: \"bc0a1f9d-aade-4d80-a5b8-fbc8542431a7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" Apr 24 23:55:45.949289 ip-10-0-132-64 kubenswrapper[2576]: I0424 
23:55:45.949264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7nlh\" (UniqueName: \"kubernetes.io/projected/85bb6fee-0df0-467b-85aa-d617cbda12e0-kube-api-access-n7nlh\") pod \"volume-data-source-validator-7c6cbb6c87-99fjd\" (UID: \"85bb6fee-0df0-467b-85aa-d617cbda12e0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-99fjd" Apr 24 23:55:45.994857 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:45.994829 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-99fjd" Apr 24 23:55:46.003204 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.003183 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" Apr 24 23:55:46.008795 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.008775 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" Apr 24 23:55:46.017861 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.017835 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bcgjj"] Apr 24 23:55:46.028514 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:55:46.028468 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ccca75f_9d61_4cbb_bc55_f033f88df8c6.slice/crio-8175f1c4ac7cf12f82a26ccef299fd39c873eecc195f3f780b322a6bf7a8a95c WatchSource:0}: Error finding container 8175f1c4ac7cf12f82a26ccef299fd39c873eecc195f3f780b322a6bf7a8a95c: Status 404 returned error can't find the container with id 8175f1c4ac7cf12f82a26ccef299fd39c873eecc195f3f780b322a6bf7a8a95c Apr 24 23:55:46.033801 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.033717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fd9k\" (UniqueName: \"kubernetes.io/projected/d5c69f72-f063-42af-a243-45f740a1ea73-kube-api-access-4fd9k\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:46.033801 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.033750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5c69f72-f063-42af-a243-45f740a1ea73-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:46.034040 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.033976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c5fbc58-8768-4fe2-80b6-18689310ec18-trusted-ca\") 
pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.034040 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.034012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fx2g\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-kube-api-access-9fx2g\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.034219 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.034067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5c5fbc58-8768-4fe2-80b6-18689310ec18-image-registry-private-configuration\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.034219 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.034101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-bound-sa-token\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.034219 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.034146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5c5fbc58-8768-4fe2-80b6-18689310ec18-ca-trust-extracted\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.034219 ip-10-0-132-64 kubenswrapper[2576]: I0424 
23:55:46.034181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6s4h\" (UniqueName: \"kubernetes.io/projected/657f2810-9fef-43b7-825e-f2573c428db1-kube-api-access-l6s4h\") pod \"network-check-source-8894fc9bd-9l8ks\" (UID: \"657f2810-9fef-43b7-825e-f2573c428db1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9l8ks" Apr 24 23:55:46.034219 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.034208 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d5c69f72-f063-42af-a243-45f740a1ea73-snapshots\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:46.034547 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.034239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5c69f72-f063-42af-a243-45f740a1ea73-service-ca-bundle\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:46.034547 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.034264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5c69f72-f063-42af-a243-45f740a1ea73-serving-cert\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:46.034547 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.034292 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-certificates\") pod \"image-registry-65f5954f47-n9ftv\" (UID: 
\"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.034547 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.034320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.034547 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.034346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5c5fbc58-8768-4fe2-80b6-18689310ec18-installation-pull-secrets\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.034547 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.034391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d5c69f72-f063-42af-a243-45f740a1ea73-tmp\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:46.034970 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:46.034866 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:55:46.034970 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:46.034885 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65f5954f47-n9ftv: secret "image-registry-tls" not found Apr 24 23:55:46.034970 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.034922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d5c69f72-f063-42af-a243-45f740a1ea73-tmp\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:46.034970 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:46.034938 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls podName:5c5fbc58-8768-4fe2-80b6-18689310ec18 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:46.53491862 +0000 UTC m=+107.695868247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls") pod "image-registry-65f5954f47-n9ftv" (UID: "5c5fbc58-8768-4fe2-80b6-18689310ec18") : secret "image-registry-tls" not found Apr 24 23:55:46.035290 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.035015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d5c69f72-f063-42af-a243-45f740a1ea73-snapshots\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:46.035401 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.035382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5c5fbc58-8768-4fe2-80b6-18689310ec18-ca-trust-extracted\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.035540 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.035520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/5c5fbc58-8768-4fe2-80b6-18689310ec18-trusted-ca\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.036427 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.035904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5c69f72-f063-42af-a243-45f740a1ea73-service-ca-bundle\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:46.036427 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.036166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5c69f72-f063-42af-a243-45f740a1ea73-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:46.036427 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.036366 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-certificates\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.038324 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.038304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5c5fbc58-8768-4fe2-80b6-18689310ec18-image-registry-private-configuration\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.039462 
ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.039442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5c69f72-f063-42af-a243-45f740a1ea73-serving-cert\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:46.040129 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.040055 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5c5fbc58-8768-4fe2-80b6-18689310ec18-installation-pull-secrets\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.041508 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.041462 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fd9k\" (UniqueName: \"kubernetes.io/projected/d5c69f72-f063-42af-a243-45f740a1ea73-kube-api-access-4fd9k\") pod \"insights-operator-585dfdc468-sfnrq\" (UID: \"d5c69f72-f063-42af-a243-45f740a1ea73\") " pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:46.041721 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.041683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fx2g\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-kube-api-access-9fx2g\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.041721 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.041711 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-bound-sa-token\") pod 
\"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:55:46.048288 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.047904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6s4h\" (UniqueName: \"kubernetes.io/projected/657f2810-9fef-43b7-825e-f2573c428db1-kube-api-access-l6s4h\") pod \"network-check-source-8894fc9bd-9l8ks\" (UID: \"657f2810-9fef-43b7-825e-f2573c428db1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9l8ks" Apr 24 23:55:46.099038 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.099006 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9l8ks" Apr 24 23:55:46.106320 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.105943 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-sfnrq" Apr 24 23:55:46.142925 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.142768 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-99fjd"] Apr 24 23:55:46.146010 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:55:46.145976 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85bb6fee_0df0_467b_85aa_d617cbda12e0.slice/crio-2870be68b10f7ccba278ad0a84ef05c491c3507da5d99616b834e60387a1810a WatchSource:0}: Error finding container 2870be68b10f7ccba278ad0a84ef05c491c3507da5d99616b834e60387a1810a: Status 404 returned error can't find the container with id 2870be68b10f7ccba278ad0a84ef05c491c3507da5d99616b834e60387a1810a Apr 24 23:55:46.236762 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.236730 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9l8ks"] Apr 24 23:55:46.239710 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:55:46.239665 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod657f2810_9fef_43b7_825e_f2573c428db1.slice/crio-130b133097b3ed9d2ea61dc15b7e1935433fa997274cf0ce1487eac55636bcf7 WatchSource:0}: Error finding container 130b133097b3ed9d2ea61dc15b7e1935433fa997274cf0ce1487eac55636bcf7: Status 404 returned error can't find the container with id 130b133097b3ed9d2ea61dc15b7e1935433fa997274cf0ce1487eac55636bcf7 Apr 24 23:55:46.254584 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.254560 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-sfnrq"] Apr 24 23:55:46.256992 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:55:46.256968 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5c69f72_f063_42af_a243_45f740a1ea73.slice/crio-90f129679ea50f142b1b58b728d14116e646c181ce6a42738d0de75bde2d7fba WatchSource:0}: Error finding container 90f129679ea50f142b1b58b728d14116e646c181ce6a42738d0de75bde2d7fba: Status 404 returned error can't find the container with id 90f129679ea50f142b1b58b728d14116e646c181ce6a42738d0de75bde2d7fba Apr 24 23:55:46.337759 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.337677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7rhm4\" (UID: \"1d316ecc-7ca1-4dcd-a561-d363f811198c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4" Apr 24 23:55:46.338105 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:46.338080 2576 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 23:55:46.338313 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:46.338300 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls podName:1d316ecc-7ca1-4dcd-a561-d363f811198c nodeName:}" failed. No retries permitted until 2026-04-24 23:55:47.338276837 +0000 UTC m=+108.499226471 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7rhm4" (UID: "1d316ecc-7ca1-4dcd-a561-d363f811198c") : secret "samples-operator-tls" not found
Apr 24 23:55:46.363532 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.363500 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8"]
Apr 24 23:55:46.365996 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:55:46.365970 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc0a1f9d_aade_4d80_a5b8_fbc8542431a7.slice/crio-d1ccc80631084cc16dde7a7eb716c051e4c9b11a2e62fd2fcfd80f34684ba441 WatchSource:0}: Error finding container d1ccc80631084cc16dde7a7eb716c051e4c9b11a2e62fd2fcfd80f34684ba441: Status 404 returned error can't find the container with id d1ccc80631084cc16dde7a7eb716c051e4c9b11a2e62fd2fcfd80f34684ba441
Apr 24 23:55:46.368933 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.368903 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2"]
Apr 24 23:55:46.372249 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:55:46.372227 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34e5790d_4147_4de2_8280_8a4d156daee6.slice/crio-53167422416ae6ad034894c2545220bcf700287cc72d403df14e092c11f55a2d WatchSource:0}: Error finding container 53167422416ae6ad034894c2545220bcf700287cc72d403df14e092c11f55a2d: Status 404 returned error can't find the container with id 53167422416ae6ad034894c2545220bcf700287cc72d403df14e092c11f55a2d
Apr 24 23:55:46.439246 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.439205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sb9nb\" (UID: \"fe581fd0-91fe-46d8-be3f-cc2be31f574f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb"
Apr 24 23:55:46.439400 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:46.439353 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 23:55:46.439463 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:46.439433 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert podName:fe581fd0-91fe-46d8-be3f-cc2be31f574f nodeName:}" failed. No retries permitted until 2026-04-24 23:55:47.439415341 +0000 UTC m=+108.600364966 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sb9nb" (UID: "fe581fd0-91fe-46d8-be3f-cc2be31f574f") : secret "networking-console-plugin-cert" not found
Apr 24 23:55:46.539795 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.539754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv"
Apr 24 23:55:46.539972 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:46.539933 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:55:46.539972 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:46.539955 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65f5954f47-n9ftv: secret "image-registry-tls" not found
Apr 24 23:55:46.540081 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:46.540023 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls podName:5c5fbc58-8768-4fe2-80b6-18689310ec18 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:47.540001329 +0000 UTC m=+108.700950947 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls") pod "image-registry-65f5954f47-n9ftv" (UID: "5c5fbc58-8768-4fe2-80b6-18689310ec18") : secret "image-registry-tls" not found
Apr 24 23:55:46.831664 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.831608 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sfnrq" event={"ID":"d5c69f72-f063-42af-a243-45f740a1ea73","Type":"ContainerStarted","Data":"90f129679ea50f142b1b58b728d14116e646c181ce6a42738d0de75bde2d7fba"}
Apr 24 23:55:46.848724 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.848275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9l8ks" event={"ID":"657f2810-9fef-43b7-825e-f2573c428db1","Type":"ContainerStarted","Data":"dc3ce16c5a80529c4a2a6d37eb49d00ae9760134babaffb0068c3b1197e1efd4"}
Apr 24 23:55:46.848724 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.848311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9l8ks" event={"ID":"657f2810-9fef-43b7-825e-f2573c428db1","Type":"ContainerStarted","Data":"130b133097b3ed9d2ea61dc15b7e1935433fa997274cf0ce1487eac55636bcf7"}
Apr 24 23:55:46.852543 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.851789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" event={"ID":"bc0a1f9d-aade-4d80-a5b8-fbc8542431a7","Type":"ContainerStarted","Data":"d1ccc80631084cc16dde7a7eb716c051e4c9b11a2e62fd2fcfd80f34684ba441"}
Apr 24 23:55:46.855739 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.855715 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" event={"ID":"8ccca75f-9d61-4cbb-bc55-f033f88df8c6","Type":"ContainerStarted","Data":"8175f1c4ac7cf12f82a26ccef299fd39c873eecc195f3f780b322a6bf7a8a95c"}
Apr 24 23:55:46.857259 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.857234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" event={"ID":"34e5790d-4147-4de2-8280-8a4d156daee6","Type":"ContainerStarted","Data":"53167422416ae6ad034894c2545220bcf700287cc72d403df14e092c11f55a2d"}
Apr 24 23:55:46.858623 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.858570 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-99fjd" event={"ID":"85bb6fee-0df0-467b-85aa-d617cbda12e0","Type":"ContainerStarted","Data":"2870be68b10f7ccba278ad0a84ef05c491c3507da5d99616b834e60387a1810a"}
Apr 24 23:55:46.864015 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:46.863069 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9l8ks" podStartSLOduration=1.863055004 podStartE2EDuration="1.863055004s" podCreationTimestamp="2026-04-24 23:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:55:46.862100204 +0000 UTC m=+108.023049845" watchObservedRunningTime="2026-04-24 23:55:46.863055004 +0000 UTC m=+108.024004647"
Apr 24 23:55:47.347217 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:47.347162 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7rhm4\" (UID: \"1d316ecc-7ca1-4dcd-a561-d363f811198c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4"
Apr 24 23:55:47.347668 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:47.347403 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 23:55:47.347668 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:47.347469 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls podName:1d316ecc-7ca1-4dcd-a561-d363f811198c nodeName:}" failed. No retries permitted until 2026-04-24 23:55:49.347450531 +0000 UTC m=+110.508400152 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7rhm4" (UID: "1d316ecc-7ca1-4dcd-a561-d363f811198c") : secret "samples-operator-tls" not found
Apr 24 23:55:47.448666 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:47.448630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sb9nb\" (UID: \"fe581fd0-91fe-46d8-be3f-cc2be31f574f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb"
Apr 24 23:55:47.448852 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:47.448811 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 23:55:47.448908 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:47.448860 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert podName:fe581fd0-91fe-46d8-be3f-cc2be31f574f nodeName:}" failed. No retries permitted until 2026-04-24 23:55:49.448846917 +0000 UTC m=+110.609796536 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sb9nb" (UID: "fe581fd0-91fe-46d8-be3f-cc2be31f574f") : secret "networking-console-plugin-cert" not found
Apr 24 23:55:47.549413 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:47.549375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv"
Apr 24 23:55:47.550142 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:47.549665 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:55:47.550142 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:47.549715 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65f5954f47-n9ftv: secret "image-registry-tls" not found
Apr 24 23:55:47.550142 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:47.549782 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls podName:5c5fbc58-8768-4fe2-80b6-18689310ec18 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:49.549760146 +0000 UTC m=+110.710709777 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls") pod "image-registry-65f5954f47-n9ftv" (UID: "5c5fbc58-8768-4fe2-80b6-18689310ec18") : secret "image-registry-tls" not found
Apr 24 23:55:49.366107 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:49.366074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7rhm4\" (UID: \"1d316ecc-7ca1-4dcd-a561-d363f811198c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4"
Apr 24 23:55:49.366478 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:49.366239 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 23:55:49.366478 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:49.366317 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls podName:1d316ecc-7ca1-4dcd-a561-d363f811198c nodeName:}" failed. No retries permitted until 2026-04-24 23:55:53.366297147 +0000 UTC m=+114.527246785 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7rhm4" (UID: "1d316ecc-7ca1-4dcd-a561-d363f811198c") : secret "samples-operator-tls" not found
Apr 24 23:55:49.467106 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:49.467059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sb9nb\" (UID: \"fe581fd0-91fe-46d8-be3f-cc2be31f574f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb"
Apr 24 23:55:49.467293 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:49.467211 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 23:55:49.467293 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:49.467276 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert podName:fe581fd0-91fe-46d8-be3f-cc2be31f574f nodeName:}" failed. No retries permitted until 2026-04-24 23:55:53.467256952 +0000 UTC m=+114.628206584 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sb9nb" (UID: "fe581fd0-91fe-46d8-be3f-cc2be31f574f") : secret "networking-console-plugin-cert" not found
Apr 24 23:55:49.567588 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:49.567549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv"
Apr 24 23:55:49.567756 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:49.567685 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:55:49.567803 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:49.567756 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65f5954f47-n9ftv: secret "image-registry-tls" not found
Apr 24 23:55:49.567852 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:49.567819 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls podName:5c5fbc58-8768-4fe2-80b6-18689310ec18 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:53.567803692 +0000 UTC m=+114.728753311 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls") pod "image-registry-65f5954f47-n9ftv" (UID: "5c5fbc58-8768-4fe2-80b6-18689310ec18") : secret "image-registry-tls" not found
Apr 24 23:55:50.869347 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:50.869309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sfnrq" event={"ID":"d5c69f72-f063-42af-a243-45f740a1ea73","Type":"ContainerStarted","Data":"d9c570de223238ef64867607586f8d8def2d02a3c00fba774b626416c373f0b3"}
Apr 24 23:55:50.870726 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:50.870678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" event={"ID":"bc0a1f9d-aade-4d80-a5b8-fbc8542431a7","Type":"ContainerStarted","Data":"22776efa65079d6a9bb87ef6ab06271083e1c9ea172b87fb38975f512bda1e1c"}
Apr 24 23:55:50.872462 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:50.872440 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/0.log"
Apr 24 23:55:50.872564 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:50.872477 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ccca75f-9d61-4cbb-bc55-f033f88df8c6" containerID="c76816d9d2d523a0cb54a17eca48bcee277a97065c540f98924a3d8a0c604ac9" exitCode=255
Apr 24 23:55:50.872564 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:50.872546 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" event={"ID":"8ccca75f-9d61-4cbb-bc55-f033f88df8c6","Type":"ContainerDied","Data":"c76816d9d2d523a0cb54a17eca48bcee277a97065c540f98924a3d8a0c604ac9"}
Apr 24 23:55:50.872856 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:50.872837 2576 scope.go:117] "RemoveContainer" containerID="c76816d9d2d523a0cb54a17eca48bcee277a97065c540f98924a3d8a0c604ac9"
Apr 24 23:55:50.874445 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:50.874420 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" event={"ID":"34e5790d-4147-4de2-8280-8a4d156daee6","Type":"ContainerStarted","Data":"6620fd58674c33a5f8180d845e73a9ad771612aba2d483909b5200494c6997b5"}
Apr 24 23:55:50.876163 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:50.876134 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-99fjd" event={"ID":"85bb6fee-0df0-467b-85aa-d617cbda12e0","Type":"ContainerStarted","Data":"db63def9636c7e13bcd0abcc6b5e6bcee327cdf13baa2327df3faddf4a72e04e"}
Apr 24 23:55:50.885005 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:50.884957 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-sfnrq" podStartSLOduration=2.069744429 podStartE2EDuration="5.884942038s" podCreationTimestamp="2026-04-24 23:55:45 +0000 UTC" firstStartedPulling="2026-04-24 23:55:46.259145378 +0000 UTC m=+107.420094998" lastFinishedPulling="2026-04-24 23:55:50.074342985 +0000 UTC m=+111.235292607" observedRunningTime="2026-04-24 23:55:50.883191883 +0000 UTC m=+112.044141540" watchObservedRunningTime="2026-04-24 23:55:50.884942038 +0000 UTC m=+112.045891681"
Apr 24 23:55:50.912372 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:50.912231 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" podStartSLOduration=2.208913011 podStartE2EDuration="5.91221426s" podCreationTimestamp="2026-04-24 23:55:45 +0000 UTC" firstStartedPulling="2026-04-24 23:55:46.374357853 +0000 UTC m=+107.535307472" lastFinishedPulling="2026-04-24 23:55:50.077659099 +0000 UTC m=+111.238608721" observedRunningTime="2026-04-24 23:55:50.912054195 +0000 UTC m=+112.073003837" watchObservedRunningTime="2026-04-24 23:55:50.91221426 +0000 UTC m=+112.073163903"
Apr 24 23:55:50.945119 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:50.945064 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-99fjd" podStartSLOduration=2.018933738 podStartE2EDuration="5.945045738s" podCreationTimestamp="2026-04-24 23:55:45 +0000 UTC" firstStartedPulling="2026-04-24 23:55:46.14791254 +0000 UTC m=+107.308862159" lastFinishedPulling="2026-04-24 23:55:50.074024534 +0000 UTC m=+111.234974159" observedRunningTime="2026-04-24 23:55:50.92615583 +0000 UTC m=+112.087105476" watchObservedRunningTime="2026-04-24 23:55:50.945045738 +0000 UTC m=+112.105995383"
Apr 24 23:55:50.945256 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:50.945208 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" podStartSLOduration=2.234118438 podStartE2EDuration="5.945203438s" podCreationTimestamp="2026-04-24 23:55:45 +0000 UTC" firstStartedPulling="2026-04-24 23:55:46.367980446 +0000 UTC m=+107.528930070" lastFinishedPulling="2026-04-24 23:55:50.07906545 +0000 UTC m=+111.240015070" observedRunningTime="2026-04-24 23:55:50.943062611 +0000 UTC m=+112.104012254" watchObservedRunningTime="2026-04-24 23:55:50.945203438 +0000 UTC m=+112.106153080"
Apr 24 23:55:51.191056 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.190981 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-66k59"]
Apr 24 23:55:51.193970 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.193954 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-66k59"
Apr 24 23:55:51.196067 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.196036 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-5mzpb\""
Apr 24 23:55:51.196177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.196074 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 24 23:55:51.196177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.196074 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 24 23:55:51.204315 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.204296 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-66k59"]
Apr 24 23:55:51.281430 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.281395 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2jth\" (UniqueName: \"kubernetes.io/projected/14e1f6a5-afdd-48c3-8639-9d32f9e1b10b-kube-api-access-q2jth\") pod \"migrator-74bb7799d9-66k59\" (UID: \"14e1f6a5-afdd-48c3-8639-9d32f9e1b10b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-66k59"
Apr 24 23:55:51.382298 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.382258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2jth\" (UniqueName: \"kubernetes.io/projected/14e1f6a5-afdd-48c3-8639-9d32f9e1b10b-kube-api-access-q2jth\") pod \"migrator-74bb7799d9-66k59\" (UID: \"14e1f6a5-afdd-48c3-8639-9d32f9e1b10b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-66k59"
Apr 24 23:55:51.389844 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.389812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2jth\" (UniqueName: \"kubernetes.io/projected/14e1f6a5-afdd-48c3-8639-9d32f9e1b10b-kube-api-access-q2jth\") pod \"migrator-74bb7799d9-66k59\" (UID: \"14e1f6a5-afdd-48c3-8639-9d32f9e1b10b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-66k59"
Apr 24 23:55:51.502817 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.502789 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-66k59"
Apr 24 23:55:51.621779 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.621747 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-66k59"]
Apr 24 23:55:51.624504 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:55:51.624477 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14e1f6a5_afdd_48c3_8639_9d32f9e1b10b.slice/crio-9e0ed79da1a36aae9b01bfda54e130e21a3921f6c5bb77f0edbc3e4304acd536 WatchSource:0}: Error finding container 9e0ed79da1a36aae9b01bfda54e130e21a3921f6c5bb77f0edbc3e4304acd536: Status 404 returned error can't find the container with id 9e0ed79da1a36aae9b01bfda54e130e21a3921f6c5bb77f0edbc3e4304acd536
Apr 24 23:55:51.880433 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.880363 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/1.log"
Apr 24 23:55:51.880840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.880748 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/0.log"
Apr 24 23:55:51.880840 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.880783 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ccca75f-9d61-4cbb-bc55-f033f88df8c6" containerID="f508da639ea484fda9d8fcddb1bcbd9296facd897b6bb02ba7edff5dfbbaff24" exitCode=255
Apr 24 23:55:51.880928 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.880868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" event={"ID":"8ccca75f-9d61-4cbb-bc55-f033f88df8c6","Type":"ContainerDied","Data":"f508da639ea484fda9d8fcddb1bcbd9296facd897b6bb02ba7edff5dfbbaff24"}
Apr 24 23:55:51.880928 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.880909 2576 scope.go:117] "RemoveContainer" containerID="c76816d9d2d523a0cb54a17eca48bcee277a97065c540f98924a3d8a0c604ac9"
Apr 24 23:55:51.881166 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.881147 2576 scope.go:117] "RemoveContainer" containerID="f508da639ea484fda9d8fcddb1bcbd9296facd897b6bb02ba7edff5dfbbaff24"
Apr 24 23:55:51.881396 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:51.881358 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bcgjj_openshift-console-operator(8ccca75f-9d61-4cbb-bc55-f033f88df8c6)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" podUID="8ccca75f-9d61-4cbb-bc55-f033f88df8c6"
Apr 24 23:55:51.882125 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:51.882087 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-66k59" event={"ID":"14e1f6a5-afdd-48c3-8639-9d32f9e1b10b","Type":"ContainerStarted","Data":"9e0ed79da1a36aae9b01bfda54e130e21a3921f6c5bb77f0edbc3e4304acd536"}
Apr 24 23:55:52.886177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:52.886085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-66k59" event={"ID":"14e1f6a5-afdd-48c3-8639-9d32f9e1b10b","Type":"ContainerStarted","Data":"3412d2a3498e5a1f73673a639df69e1f8736c0df890cddcefcac40e52edb3c66"}
Apr 24 23:55:52.886177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:52.886129 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-66k59" event={"ID":"14e1f6a5-afdd-48c3-8639-9d32f9e1b10b","Type":"ContainerStarted","Data":"bc4fe230674cd34b709f81bbc8667909a2603bd53b9c59f325a25b1d22e9001a"}
Apr 24 23:55:52.887453 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:52.887433 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/1.log"
Apr 24 23:55:52.887734 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:52.887720 2576 scope.go:117] "RemoveContainer" containerID="f508da639ea484fda9d8fcddb1bcbd9296facd897b6bb02ba7edff5dfbbaff24"
Apr 24 23:55:52.887891 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:52.887875 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bcgjj_openshift-console-operator(8ccca75f-9d61-4cbb-bc55-f033f88df8c6)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" podUID="8ccca75f-9d61-4cbb-bc55-f033f88df8c6"
Apr 24 23:55:52.900682 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:52.900629 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-66k59" podStartSLOduration=0.904181976 podStartE2EDuration="1.900612657s" podCreationTimestamp="2026-04-24 23:55:51 +0000 UTC" firstStartedPulling="2026-04-24 23:55:51.626393654 +0000 UTC m=+112.787343277" lastFinishedPulling="2026-04-24 23:55:52.622824326 +0000 UTC m=+113.783773958" observedRunningTime="2026-04-24 23:55:52.899200691 +0000 UTC m=+114.060150333" watchObservedRunningTime="2026-04-24 23:55:52.900612657 +0000 UTC m=+114.061562299"
Apr 24 23:55:53.401250 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:53.401217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7rhm4\" (UID: \"1d316ecc-7ca1-4dcd-a561-d363f811198c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4"
Apr 24 23:55:53.401426 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:53.401321 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 23:55:53.401426 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:53.401378 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls podName:1d316ecc-7ca1-4dcd-a561-d363f811198c nodeName:}" failed. No retries permitted until 2026-04-24 23:56:01.401355843 +0000 UTC m=+122.562305462 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7rhm4" (UID: "1d316ecc-7ca1-4dcd-a561-d363f811198c") : secret "samples-operator-tls" not found
Apr 24 23:55:53.502265 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:53.502238 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sb9nb\" (UID: \"fe581fd0-91fe-46d8-be3f-cc2be31f574f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb"
Apr 24 23:55:53.502400 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:53.502384 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 23:55:53.502462 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:53.502452 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert podName:fe581fd0-91fe-46d8-be3f-cc2be31f574f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:01.502430327 +0000 UTC m=+122.663379964 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sb9nb" (UID: "fe581fd0-91fe-46d8-be3f-cc2be31f574f") : secret "networking-console-plugin-cert" not found
Apr 24 23:55:53.602983 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:53.602946 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv"
Apr 24 23:55:53.603124 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:53.603106 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:55:53.603159 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:53.603128 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65f5954f47-n9ftv: secret "image-registry-tls" not found
Apr 24 23:55:53.603208 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:53.603187 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls podName:5c5fbc58-8768-4fe2-80b6-18689310ec18 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:01.603171634 +0000 UTC m=+122.764121257 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls") pod "image-registry-65f5954f47-n9ftv" (UID: "5c5fbc58-8768-4fe2-80b6-18689310ec18") : secret "image-registry-tls" not found
Apr 24 23:55:54.051365 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:54.051339 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gfw9h_84d61329-00aa-4270-b9d1-b1f736da6f64/dns-node-resolver/0.log"
Apr 24 23:55:55.050445 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:55.050416 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cm667_ba532e45-f2da-4349-bf2b-680421e6b958/node-ca/0.log"
Apr 24 23:55:55.903548 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:55.903521 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj"
Apr 24 23:55:55.903548 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:55.903550 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj"
Apr 24 23:55:55.903979 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:55.903927 2576 scope.go:117] "RemoveContainer" containerID="f508da639ea484fda9d8fcddb1bcbd9296facd897b6bb02ba7edff5dfbbaff24"
Apr 24 23:55:55.904116 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:55:55.904097 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bcgjj_openshift-console-operator(8ccca75f-9d61-4cbb-bc55-f033f88df8c6)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" podUID="8ccca75f-9d61-4cbb-bc55-f033f88df8c6"
Apr 24 23:55:56.251413 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:56.251383 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-66k59_14e1f6a5-afdd-48c3-8639-9d32f9e1b10b/migrator/0.log"
Apr 24 23:55:56.450500 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:56.450475 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-66k59_14e1f6a5-afdd-48c3-8639-9d32f9e1b10b/graceful-termination/0.log"
Apr 24 23:55:56.654616 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:55:56.654537 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gtxb2_34e5790d-4147-4de2-8280-8a4d156daee6/kube-storage-version-migrator-operator/0.log"
Apr 24 23:56:01.465814 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.465771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7rhm4\" (UID: \"1d316ecc-7ca1-4dcd-a561-d363f811198c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4"
Apr 24 23:56:01.468297 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.468268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d316ecc-7ca1-4dcd-a561-d363f811198c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7rhm4\" (UID: \"1d316ecc-7ca1-4dcd-a561-d363f811198c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4"
Apr 24 23:56:01.500405 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.500378 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-62gwr\""
Apr 24 23:56:01.509211 ip-10-0-132-64 kubenswrapper[2576]: I0424
23:56:01.509195 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4" Apr 24 23:56:01.567221 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.567188 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sb9nb\" (UID: \"fe581fd0-91fe-46d8-be3f-cc2be31f574f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb" Apr 24 23:56:01.567573 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:56:01.567549 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:56:01.567687 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:56:01.567674 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert podName:fe581fd0-91fe-46d8-be3f-cc2be31f574f nodeName:}" failed. No retries permitted until 2026-04-24 23:56:17.567650432 +0000 UTC m=+138.728600051 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sb9nb" (UID: "fe581fd0-91fe-46d8-be3f-cc2be31f574f") : secret "networking-console-plugin-cert" not found Apr 24 23:56:01.628229 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.628195 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4"] Apr 24 23:56:01.668603 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.668575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:56:01.671156 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.671135 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls\") pod \"image-registry-65f5954f47-n9ftv\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:56:01.714115 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.714089 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-f8xnt\"" Apr 24 23:56:01.722992 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.722942 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:56:01.840200 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.840160 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-65f5954f47-n9ftv"] Apr 24 23:56:01.843263 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:56:01.843239 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c5fbc58_8768_4fe2_80b6_18689310ec18.slice/crio-649ec64f1f3b09be039c5d7df03a7c22d82abc4246e666aef796e382d92e42cb WatchSource:0}: Error finding container 649ec64f1f3b09be039c5d7df03a7c22d82abc4246e666aef796e382d92e42cb: Status 404 returned error can't find the container with id 649ec64f1f3b09be039c5d7df03a7c22d82abc4246e666aef796e382d92e42cb Apr 24 23:56:01.913290 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.913255 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" event={"ID":"5c5fbc58-8768-4fe2-80b6-18689310ec18","Type":"ContainerStarted","Data":"878bf82dbcee335b8d439e575d91f80ea3bcb5ab92375f7b82ea9e6ad9c2606a"} Apr 24 23:56:01.913393 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.913296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" event={"ID":"5c5fbc58-8768-4fe2-80b6-18689310ec18","Type":"ContainerStarted","Data":"649ec64f1f3b09be039c5d7df03a7c22d82abc4246e666aef796e382d92e42cb"} Apr 24 23:56:01.913464 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.913447 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:56:01.914238 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.914213 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4" 
event={"ID":"1d316ecc-7ca1-4dcd-a561-d363f811198c","Type":"ContainerStarted","Data":"6f5c3c86d0fd7d06a14f8f78e0b96942fa77c1d89da661fbcfbf1f3c1280e290"} Apr 24 23:56:01.931115 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:01.931076 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" podStartSLOduration=16.931065048 podStartE2EDuration="16.931065048s" podCreationTimestamp="2026-04-24 23:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:56:01.93022541 +0000 UTC m=+123.091175050" watchObservedRunningTime="2026-04-24 23:56:01.931065048 +0000 UTC m=+123.092014688" Apr 24 23:56:03.920774 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:03.920738 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4" event={"ID":"1d316ecc-7ca1-4dcd-a561-d363f811198c","Type":"ContainerStarted","Data":"8858427f6e0903101ed564ddf30dd2a48170d88419677df9d862f40fd23853a8"} Apr 24 23:56:03.920774 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:03.920776 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4" event={"ID":"1d316ecc-7ca1-4dcd-a561-d363f811198c","Type":"ContainerStarted","Data":"e14e671636d9fc5fd79aaaddaad40ca74de89a43d7b3ad4a98be44424a0f2722"} Apr 24 23:56:03.936048 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:03.936004 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7rhm4" podStartSLOduration=17.411783402 podStartE2EDuration="18.935992417s" podCreationTimestamp="2026-04-24 23:55:45 +0000 UTC" firstStartedPulling="2026-04-24 23:56:01.684387871 +0000 UTC m=+122.845337489" lastFinishedPulling="2026-04-24 23:56:03.208596881 +0000 UTC 
m=+124.369546504" observedRunningTime="2026-04-24 23:56:03.935830068 +0000 UTC m=+125.096779708" watchObservedRunningTime="2026-04-24 23:56:03.935992417 +0000 UTC m=+125.096942058" Apr 24 23:56:08.230170 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:08.230130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs\") pod \"network-metrics-daemon-wrw7v\" (UID: \"a4df8649-8216-4ed9-b023-a6de8b027cd5\") " pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:56:08.232567 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:08.232543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4df8649-8216-4ed9-b023-a6de8b027cd5-metrics-certs\") pod \"network-metrics-daemon-wrw7v\" (UID: \"a4df8649-8216-4ed9-b023-a6de8b027cd5\") " pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:56:08.516284 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:08.516242 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jk9j5\"" Apr 24 23:56:08.525050 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:08.525023 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wrw7v" Apr 24 23:56:08.641914 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:08.641883 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wrw7v"] Apr 24 23:56:08.644638 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:56:08.644608 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4df8649_8216_4ed9_b023_a6de8b027cd5.slice/crio-de0b88b8554ec9e2e7fd8a0fd4e84b94f369f0ed69b34110431b8e73296901c3 WatchSource:0}: Error finding container de0b88b8554ec9e2e7fd8a0fd4e84b94f369f0ed69b34110431b8e73296901c3: Status 404 returned error can't find the container with id de0b88b8554ec9e2e7fd8a0fd4e84b94f369f0ed69b34110431b8e73296901c3 Apr 24 23:56:08.934558 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:08.934473 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wrw7v" event={"ID":"a4df8649-8216-4ed9-b023-a6de8b027cd5","Type":"ContainerStarted","Data":"de0b88b8554ec9e2e7fd8a0fd4e84b94f369f0ed69b34110431b8e73296901c3"} Apr 24 23:56:09.402996 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:09.399034 2576 scope.go:117] "RemoveContainer" containerID="f508da639ea484fda9d8fcddb1bcbd9296facd897b6bb02ba7edff5dfbbaff24" Apr 24 23:56:09.938758 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:09.938722 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wrw7v" event={"ID":"a4df8649-8216-4ed9-b023-a6de8b027cd5","Type":"ContainerStarted","Data":"ed450e5d016835633aaa5c51c896b3b9db43b20e8a72997016ddc94415d74231"} Apr 24 23:56:09.938900 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:09.938765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wrw7v" 
event={"ID":"a4df8649-8216-4ed9-b023-a6de8b027cd5","Type":"ContainerStarted","Data":"b1338a8ec532e6bb48b21c799cc66d96cd89a8995b5129dd663cda5734ca6277"} Apr 24 23:56:09.940319 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:09.940298 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/1.log" Apr 24 23:56:09.940423 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:09.940347 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" event={"ID":"8ccca75f-9d61-4cbb-bc55-f033f88df8c6","Type":"ContainerStarted","Data":"dafe21f9e3a3870f8e2860788802ae070adde4bafc613d0487b1813c5cb1eab6"} Apr 24 23:56:09.940603 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:09.940587 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:56:09.956475 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:09.956430 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wrw7v" podStartSLOduration=129.969804911 podStartE2EDuration="2m10.956415865s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="2026-04-24 23:56:08.646481143 +0000 UTC m=+129.807430762" lastFinishedPulling="2026-04-24 23:56:09.633092094 +0000 UTC m=+130.794041716" observedRunningTime="2026-04-24 23:56:09.955777584 +0000 UTC m=+131.116727225" watchObservedRunningTime="2026-04-24 23:56:09.956415865 +0000 UTC m=+131.117365508" Apr 24 23:56:09.974625 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:09.974542 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" podStartSLOduration=20.931225783 podStartE2EDuration="24.974532699s" podCreationTimestamp="2026-04-24 23:55:45 +0000 UTC" 
firstStartedPulling="2026-04-24 23:55:46.030718957 +0000 UTC m=+107.191668577" lastFinishedPulling="2026-04-24 23:55:50.074025866 +0000 UTC m=+111.234975493" observedRunningTime="2026-04-24 23:56:09.973562831 +0000 UTC m=+131.134512471" watchObservedRunningTime="2026-04-24 23:56:09.974532699 +0000 UTC m=+131.135482342" Apr 24 23:56:10.045763 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:10.045736 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-bcgjj" Apr 24 23:56:16.512941 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.512906 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-65f5954f47-n9ftv"] Apr 24 23:56:16.517146 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.517114 2576 patch_prober.go:28] interesting pod/image-registry-65f5954f47-n9ftv container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 23:56:16.517283 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.517173 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" podUID="5c5fbc58-8768-4fe2-80b6-18689310ec18" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:56:16.633515 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.633470 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-w5st9"] Apr 24 23:56:16.636971 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.636954 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.639651 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.639634 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rp52n\"" Apr 24 23:56:16.639751 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.639682 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 23:56:16.641004 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.640981 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 23:56:16.653605 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.653582 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-w5st9"] Apr 24 23:56:16.696130 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.696102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/086f85ea-a11f-451b-94e1-ff8da489f053-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-w5st9\" (UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.696243 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.696174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/086f85ea-a11f-451b-94e1-ff8da489f053-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w5st9\" (UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.696324 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.696244 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvdxt\" (UniqueName: \"kubernetes.io/projected/086f85ea-a11f-451b-94e1-ff8da489f053-kube-api-access-wvdxt\") pod \"insights-runtime-extractor-w5st9\" (UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.696324 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.696279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/086f85ea-a11f-451b-94e1-ff8da489f053-crio-socket\") pod \"insights-runtime-extractor-w5st9\" (UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.696324 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.696302 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/086f85ea-a11f-451b-94e1-ff8da489f053-data-volume\") pod \"insights-runtime-extractor-w5st9\" (UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.797492 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.797427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/086f85ea-a11f-451b-94e1-ff8da489f053-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-w5st9\" (UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.797492 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.797483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/086f85ea-a11f-451b-94e1-ff8da489f053-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w5st9\" 
(UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.797642 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.797513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvdxt\" (UniqueName: \"kubernetes.io/projected/086f85ea-a11f-451b-94e1-ff8da489f053-kube-api-access-wvdxt\") pod \"insights-runtime-extractor-w5st9\" (UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.797642 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.797531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/086f85ea-a11f-451b-94e1-ff8da489f053-crio-socket\") pod \"insights-runtime-extractor-w5st9\" (UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.797642 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.797547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/086f85ea-a11f-451b-94e1-ff8da489f053-data-volume\") pod \"insights-runtime-extractor-w5st9\" (UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.797642 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.797623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/086f85ea-a11f-451b-94e1-ff8da489f053-crio-socket\") pod \"insights-runtime-extractor-w5st9\" (UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.797970 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.797948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/086f85ea-a11f-451b-94e1-ff8da489f053-data-volume\") pod \"insights-runtime-extractor-w5st9\" (UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.798056 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.798041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/086f85ea-a11f-451b-94e1-ff8da489f053-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-w5st9\" (UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.800085 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.800066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/086f85ea-a11f-451b-94e1-ff8da489f053-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w5st9\" (UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.815534 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.815504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvdxt\" (UniqueName: \"kubernetes.io/projected/086f85ea-a11f-451b-94e1-ff8da489f053-kube-api-access-wvdxt\") pod \"insights-runtime-extractor-w5st9\" (UID: \"086f85ea-a11f-451b-94e1-ff8da489f053\") " pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:16.946295 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:16.946250 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-w5st9" Apr 24 23:56:17.093283 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:17.093257 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-w5st9"] Apr 24 23:56:17.095160 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:56:17.095133 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod086f85ea_a11f_451b_94e1_ff8da489f053.slice/crio-00bfba95a7287eb719dbfae259691c23f094289049a2e33f594ab6d856f16bbc WatchSource:0}: Error finding container 00bfba95a7287eb719dbfae259691c23f094289049a2e33f594ab6d856f16bbc: Status 404 returned error can't find the container with id 00bfba95a7287eb719dbfae259691c23f094289049a2e33f594ab6d856f16bbc Apr 24 23:56:17.604447 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:17.604399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sb9nb\" (UID: \"fe581fd0-91fe-46d8-be3f-cc2be31f574f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb" Apr 24 23:56:17.607310 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:17.607278 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fe581fd0-91fe-46d8-be3f-cc2be31f574f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sb9nb\" (UID: \"fe581fd0-91fe-46d8-be3f-cc2be31f574f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb" Apr 24 23:56:17.816980 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:17.816952 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-console\"/\"default-dockercfg-zxhwf\"" Apr 24 23:56:17.825760 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:17.825735 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb" Apr 24 23:56:17.946842 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:17.946811 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb"] Apr 24 23:56:17.950785 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:56:17.950760 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe581fd0_91fe_46d8_be3f_cc2be31f574f.slice/crio-0e145f965715808338e242be411b7f65c5b17aae548c5945e106515fef2af4cb WatchSource:0}: Error finding container 0e145f965715808338e242be411b7f65c5b17aae548c5945e106515fef2af4cb: Status 404 returned error can't find the container with id 0e145f965715808338e242be411b7f65c5b17aae548c5945e106515fef2af4cb Apr 24 23:56:17.963014 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:17.962988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb" event={"ID":"fe581fd0-91fe-46d8-be3f-cc2be31f574f","Type":"ContainerStarted","Data":"0e145f965715808338e242be411b7f65c5b17aae548c5945e106515fef2af4cb"} Apr 24 23:56:17.964454 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:17.964431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w5st9" event={"ID":"086f85ea-a11f-451b-94e1-ff8da489f053","Type":"ContainerStarted","Data":"38fb8c51fcaf798bff245aec9079c3485702743fb6ecbdcf09d71f7e4abd3000"} Apr 24 23:56:17.964530 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:17.964461 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w5st9" 
event={"ID":"086f85ea-a11f-451b-94e1-ff8da489f053","Type":"ContainerStarted","Data":"0be4edd899c43a200a018d948175ae9a82ba62d6485dbfb8b130c1b048025108"} Apr 24 23:56:17.964530 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:17.964472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w5st9" event={"ID":"086f85ea-a11f-451b-94e1-ff8da489f053","Type":"ContainerStarted","Data":"00bfba95a7287eb719dbfae259691c23f094289049a2e33f594ab6d856f16bbc"} Apr 24 23:56:19.971223 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:19.971188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb" event={"ID":"fe581fd0-91fe-46d8-be3f-cc2be31f574f","Type":"ContainerStarted","Data":"0e171507e45971bb21d00ec2bdf77a5c78ba97a6bf53309cdbd88f9bf1cb1bc2"} Apr 24 23:56:19.973133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:19.973106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w5st9" event={"ID":"086f85ea-a11f-451b-94e1-ff8da489f053","Type":"ContainerStarted","Data":"425bf59449e31d06a728a6e5f55f9196a36cd3cf4b5112d35c2e2ed9104f2b03"} Apr 24 23:56:19.987984 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:19.987932 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sb9nb" podStartSLOduration=33.627123608 podStartE2EDuration="34.987917543s" podCreationTimestamp="2026-04-24 23:55:45 +0000 UTC" firstStartedPulling="2026-04-24 23:56:17.952570791 +0000 UTC m=+139.113520413" lastFinishedPulling="2026-04-24 23:56:19.313364728 +0000 UTC m=+140.474314348" observedRunningTime="2026-04-24 23:56:19.987486953 +0000 UTC m=+141.148436618" watchObservedRunningTime="2026-04-24 23:56:19.987917543 +0000 UTC m=+141.148867184" Apr 24 23:56:20.037604 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:20.037514 2576 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-w5st9" podStartSLOduration=1.881171111 podStartE2EDuration="4.037499579s" podCreationTimestamp="2026-04-24 23:56:16 +0000 UTC" firstStartedPulling="2026-04-24 23:56:17.156913719 +0000 UTC m=+138.317863339" lastFinishedPulling="2026-04-24 23:56:19.31324218 +0000 UTC m=+140.474191807" observedRunningTime="2026-04-24 23:56:20.037035526 +0000 UTC m=+141.197985169" watchObservedRunningTime="2026-04-24 23:56:20.037499579 +0000 UTC m=+141.198449269"
Apr 24 23:56:20.421131 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:20.421101 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pcm5d"]
Apr 24 23:56:20.425244 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:20.425220 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pcm5d"
Apr 24 23:56:20.427348 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:20.427323 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 24 23:56:20.427493 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:20.427473 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-qdwqt\""
Apr 24 23:56:20.431042 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:20.431000 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pcm5d"]
Apr 24 23:56:20.524266 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:20.524230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pcm5d\" (UID: \"29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pcm5d"
Apr 24 23:56:20.625517 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:20.625483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pcm5d\" (UID: \"29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pcm5d"
Apr 24 23:56:20.625626 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:56:20.625602 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 24 23:56:20.625682 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:56:20.625668 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b-tls-certificates podName:29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b nodeName:}" failed. No retries permitted until 2026-04-24 23:56:21.125640542 +0000 UTC m=+142.286590161 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-pcm5d" (UID: "29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 24 23:56:21.130168 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:21.130132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pcm5d\" (UID: \"29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pcm5d"
Apr 24 23:56:21.135484 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:21.135453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pcm5d\" (UID: \"29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pcm5d"
Apr 24 23:56:21.335262 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:21.335220 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pcm5d"
Apr 24 23:56:21.457537 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:21.457509 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pcm5d"]
Apr 24 23:56:21.462206 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:56:21.462178 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29f3e42e_6f53_4dd6_b2bd_8b3b379a2d0b.slice/crio-2b47be3b98d11567e4d5d3dd478cfb7d59eceacbed15f79c6dceb5c2692ff487 WatchSource:0}: Error finding container 2b47be3b98d11567e4d5d3dd478cfb7d59eceacbed15f79c6dceb5c2692ff487: Status 404 returned error can't find the container with id 2b47be3b98d11567e4d5d3dd478cfb7d59eceacbed15f79c6dceb5c2692ff487
Apr 24 23:56:21.979121 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:21.979087 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pcm5d" event={"ID":"29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b","Type":"ContainerStarted","Data":"2b47be3b98d11567e4d5d3dd478cfb7d59eceacbed15f79c6dceb5c2692ff487"}
Apr 24 23:56:22.982797 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:22.982764 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pcm5d" event={"ID":"29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b","Type":"ContainerStarted","Data":"f1fe7f58b0b047494351f32f390abeb9bc2630cd8b69d89e679b97f10d1cbb1a"}
Apr 24 23:56:22.983178 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:22.983026 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pcm5d"
Apr 24 23:56:22.988028 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:22.988002 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pcm5d"
Apr 24 23:56:22.999760 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:22.999725 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pcm5d" podStartSLOduration=1.61241773 podStartE2EDuration="2.999712808s" podCreationTimestamp="2026-04-24 23:56:20 +0000 UTC" firstStartedPulling="2026-04-24 23:56:21.464024521 +0000 UTC m=+142.624974139" lastFinishedPulling="2026-04-24 23:56:22.851319581 +0000 UTC m=+144.012269217" observedRunningTime="2026-04-24 23:56:22.998035214 +0000 UTC m=+144.158984854" watchObservedRunningTime="2026-04-24 23:56:22.999712808 +0000 UTC m=+144.160662441"
Apr 24 23:56:25.534271 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.534233 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cb66f7d6-zc5fm"]
Apr 24 23:56:25.563002 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.562967 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cb66f7d6-zc5fm"]
Apr 24 23:56:25.563141 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.563012 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.565293 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.565269 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 23:56:25.565440 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.565345 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 23:56:25.565440 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.565405 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 23:56:25.565556 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.565474 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 23:56:25.566278 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.566260 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-2sr25\""
Apr 24 23:56:25.566278 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.566273 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 23:56:25.566278 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.566279 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 23:56:25.566507 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.566340 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 23:56:25.663644 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.663619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-config\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.663829 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.663654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-oauth-config\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.663829 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.663681 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-service-ca\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.663934 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.663851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-oauth-serving-cert\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.663934 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.663889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-serving-cert\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.663934 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.663914 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqf4w\" (UniqueName: \"kubernetes.io/projected/07f709e0-ce17-4501-9585-6b8fb8a4b824-kube-api-access-cqf4w\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.765082 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.765042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-oauth-serving-cert\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.765082 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.765087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-serving-cert\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.765349 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.765109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqf4w\" (UniqueName: \"kubernetes.io/projected/07f709e0-ce17-4501-9585-6b8fb8a4b824-kube-api-access-cqf4w\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.765349 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.765269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-config\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.765349 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.765326 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-oauth-config\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.765515 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.765358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-service-ca\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.765936 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.765847 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-oauth-serving-cert\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.766153 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.766057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-config\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.766153 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.766057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-service-ca\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.767730 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.767686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-oauth-config\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.767919 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.767899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-serving-cert\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.782135 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.782107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqf4w\" (UniqueName: \"kubernetes.io/projected/07f709e0-ce17-4501-9585-6b8fb8a4b824-kube-api-access-cqf4w\") pod \"console-7cb66f7d6-zc5fm\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") " pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.872316 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.872231 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:56:25.998183 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:25.998159 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cb66f7d6-zc5fm"]
Apr 24 23:56:26.000348 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:56:26.000324 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07f709e0_ce17_4501_9585_6b8fb8a4b824.slice/crio-0146d80bbeb84852ac664d370a3a3f9ca3ec02c79fbe37b9c948dfbcd43ab09f WatchSource:0}: Error finding container 0146d80bbeb84852ac664d370a3a3f9ca3ec02c79fbe37b9c948dfbcd43ab09f: Status 404 returned error can't find the container with id 0146d80bbeb84852ac664d370a3a3f9ca3ec02c79fbe37b9c948dfbcd43ab09f
Apr 24 23:56:26.518108 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:26.518074 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-65f5954f47-n9ftv"
Apr 24 23:56:26.994803 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:26.994768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cb66f7d6-zc5fm" event={"ID":"07f709e0-ce17-4501-9585-6b8fb8a4b824","Type":"ContainerStarted","Data":"0146d80bbeb84852ac664d370a3a3f9ca3ec02c79fbe37b9c948dfbcd43ab09f"}
Apr 24 23:56:29.002092 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:29.002052 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cb66f7d6-zc5fm" event={"ID":"07f709e0-ce17-4501-9585-6b8fb8a4b824","Type":"ContainerStarted","Data":"b5ae2a5cbf8e59a4a02b21742a85061c994c3b50de5547e10504c74780229648"}
Apr 24 23:56:29.024450 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:29.024402 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cb66f7d6-zc5fm" podStartSLOduration=1.536496626 podStartE2EDuration="4.024388434s" podCreationTimestamp="2026-04-24 23:56:25 +0000 UTC" firstStartedPulling="2026-04-24 23:56:26.002027335 +0000 UTC m=+147.162976954" lastFinishedPulling="2026-04-24 23:56:28.489919143 +0000 UTC m=+149.650868762" observedRunningTime="2026-04-24 23:56:29.022366259 +0000 UTC m=+150.183315898" watchObservedRunningTime="2026-04-24 23:56:29.024388434 +0000 UTC m=+150.185338076"
Apr 24 23:56:34.943459 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.943423 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g"]
Apr 24 23:56:34.946784 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.946767 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g"
Apr 24 23:56:34.949340 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.949311 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 24 23:56:34.950124 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.950093 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 23:56:34.950224 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.950105 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 24 23:56:34.950224 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.950146 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-qcnkn\""
Apr 24 23:56:34.950224 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.950152 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 23:56:34.950224 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.950213 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 23:56:34.956568 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.956549 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g"]
Apr 24 23:56:34.977521 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.977500 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2n5fc"]
Apr 24 23:56:34.981024 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.981008 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:34.983146 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.983125 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 23:56:34.983628 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.983609 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hmz5f\""
Apr 24 23:56:34.983809 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.983785 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 23:56:34.983933 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:34.983890 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 23:56:35.042591 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.042555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-textfile\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.042784 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.042601 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-wtmp\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.042784 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.042621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7c9k\" (UniqueName: \"kubernetes.io/projected/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-kube-api-access-r7c9k\") pod \"openshift-state-metrics-9d44df66c-8l75g\" (UID: \"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g"
Apr 24 23:56:35.042784 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.042665 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlwz4\" (UniqueName: \"kubernetes.io/projected/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-kube-api-access-wlwz4\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.042784 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.042683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-metrics-client-ca\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.042784 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.042746 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8l75g\" (UID: \"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g"
Apr 24 23:56:35.042784 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.042761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-root\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.042784 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.042782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8l75g\" (UID: \"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g"
Apr 24 23:56:35.043064 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.042810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-sys\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.043064 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.042857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-accelerators-collector-config\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.043064 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.042894 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.043064 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.042937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-tls\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.043064 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.042980 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8l75g\" (UID: \"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g"
Apr 24 23:56:35.143900 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.143870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-textfile\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.144035 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.143909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-wtmp\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.144035 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.143937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7c9k\" (UniqueName: \"kubernetes.io/projected/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-kube-api-access-r7c9k\") pod \"openshift-state-metrics-9d44df66c-8l75g\" (UID: \"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g"
Apr 24 23:56:35.144132 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlwz4\" (UniqueName: \"kubernetes.io/projected/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-kube-api-access-wlwz4\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.144132 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-metrics-client-ca\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.144132 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-wtmp\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.144259 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8l75g\" (UID: \"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g"
Apr 24 23:56:35.144259 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-root\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.144259 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8l75g\" (UID: \"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g"
Apr 24 23:56:35.144259 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144237 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-sys\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.144259 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-textfile\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.144259 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144256 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-root\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.144546 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-accelerators-collector-config\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.144546 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.144546 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144328 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-tls\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.144546 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8l75g\" (UID: \"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g"
Apr 24 23:56:35.144546 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-sys\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.144546 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:56:35.144449 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 24 23:56:35.144546 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:56:35.144469 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 23:56:35.144546 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:56:35.144513 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-openshift-state-metrics-tls podName:c2beaa31-74bb-4e8e-a7d9-b8b39dc02466 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:35.644496698 +0000 UTC m=+156.805446321 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-8l75g" (UID: "c2beaa31-74bb-4e8e-a7d9-b8b39dc02466") : secret "openshift-state-metrics-tls" not found
Apr 24 23:56:35.144546 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:56:35.144528 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-tls podName:230b41dd-71de-46ef-9417-3bcfa4d0c7ef nodeName:}" failed. No retries permitted until 2026-04-24 23:56:35.644521471 +0000 UTC m=+156.805471092 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-tls") pod "node-exporter-2n5fc" (UID: "230b41dd-71de-46ef-9417-3bcfa4d0c7ef") : secret "node-exporter-tls" not found
Apr 24 23:56:35.145057 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144779 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-metrics-client-ca\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.145057 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8l75g\" (UID: \"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g"
Apr 24 23:56:35.145057 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.144949 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-accelerators-collector-config\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.146684 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.146665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.146937 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.146883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8l75g\" (UID: \"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g"
Apr 24 23:56:35.154749 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.154728 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlwz4\" (UniqueName: \"kubernetes.io/projected/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-kube-api-access-wlwz4\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc"
Apr 24 23:56:35.155097 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.155079 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7c9k\" (UniqueName: \"kubernetes.io/projected/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-kube-api-access-r7c9k\") pod \"openshift-state-metrics-9d44df66c-8l75g\" (UID:
\"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g" Apr 24 23:56:35.208810 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:56:35.208764 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-j4hmb" podUID="c18a83d5-7d20-4b99-9a28-d4fea36360b1" Apr 24 23:56:35.231752 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:56:35.231684 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-gb2jv" podUID="14193e4c-7287-4686-892b-3006e6c02a97" Apr 24 23:56:35.648040 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.648003 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-tls\") pod \"node-exporter-2n5fc\" (UID: \"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc" Apr 24 23:56:35.648202 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.648047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8l75g\" (UID: \"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g" Apr 24 23:56:35.650602 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.650578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/230b41dd-71de-46ef-9417-3bcfa4d0c7ef-node-exporter-tls\") pod \"node-exporter-2n5fc\" (UID: 
\"230b41dd-71de-46ef-9417-3bcfa4d0c7ef\") " pod="openshift-monitoring/node-exporter-2n5fc" Apr 24 23:56:35.650739 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.650715 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2beaa31-74bb-4e8e-a7d9-b8b39dc02466-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8l75g\" (UID: \"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g" Apr 24 23:56:35.856576 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.856525 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g" Apr 24 23:56:35.873308 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.873274 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7cb66f7d6-zc5fm" Apr 24 23:56:35.873450 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.873326 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cb66f7d6-zc5fm" Apr 24 23:56:35.878516 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.878491 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cb66f7d6-zc5fm" Apr 24 23:56:35.891654 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.891631 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2n5fc" Apr 24 23:56:35.902136 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:56:35.902099 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod230b41dd_71de_46ef_9417_3bcfa4d0c7ef.slice/crio-aaf713ac2971e9faa1bb5817627b7785aefd760dac5b2c22305203d845da262c WatchSource:0}: Error finding container aaf713ac2971e9faa1bb5817627b7785aefd760dac5b2c22305203d845da262c: Status 404 returned error can't find the container with id aaf713ac2971e9faa1bb5817627b7785aefd760dac5b2c22305203d845da262c Apr 24 23:56:35.973802 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.973772 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:56:35.978647 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.978626 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:35.982118 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.982074 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 23:56:35.982118 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.982100 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 23:56:35.982300 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.982106 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 23:56:35.982300 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.982106 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 23:56:35.982300 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.982235 2576 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 23:56:35.982300 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.982120 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-9r4jj\"" Apr 24 23:56:35.982538 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.982455 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 23:56:35.982538 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.982469 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 23:56:35.982538 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.982515 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 23:56:35.982726 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.982712 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 23:56:35.995664 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.995645 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:56:35.998636 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:35.998617 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g"] Apr 24 23:56:36.001898 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:56:36.001876 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2beaa31_74bb_4e8e_a7d9_b8b39dc02466.slice/crio-8292d3b50ab12a6a2b8cf2b4ab201a4abfb28962563db4e0366a28dbc4cee728 WatchSource:0}: Error finding container 
8292d3b50ab12a6a2b8cf2b4ab201a4abfb28962563db4e0366a28dbc4cee728: Status 404 returned error can't find the container with id 8292d3b50ab12a6a2b8cf2b4ab201a4abfb28962563db4e0366a28dbc4cee728 Apr 24 23:56:36.021362 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.021336 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2n5fc" event={"ID":"230b41dd-71de-46ef-9417-3bcfa4d0c7ef","Type":"ContainerStarted","Data":"aaf713ac2971e9faa1bb5817627b7785aefd760dac5b2c22305203d845da262c"} Apr 24 23:56:36.022512 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.022453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g" event={"ID":"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466","Type":"ContainerStarted","Data":"8292d3b50ab12a6a2b8cf2b4ab201a4abfb28962563db4e0366a28dbc4cee728"} Apr 24 23:56:36.022607 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.022533 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j4hmb" Apr 24 23:56:36.022607 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.022564 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gb2jv" Apr 24 23:56:36.026380 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.026362 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cb66f7d6-zc5fm" Apr 24 23:56:36.051914 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.051875 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-config-volume\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.052038 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.051940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.052038 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.051971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-web-config\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.052038 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.051995 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b919db65-c87d-4156-9452-d8629ac33ecf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.052146 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.052086 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.052179 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.052149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm5t8\" (UniqueName: \"kubernetes.io/projected/b919db65-c87d-4156-9452-d8629ac33ecf-kube-api-access-nm5t8\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.052212 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.052176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b919db65-c87d-4156-9452-d8629ac33ecf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.052212 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.052191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b919db65-c87d-4156-9452-d8629ac33ecf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.052280 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.052212 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/b919db65-c87d-4156-9452-d8629ac33ecf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.052280 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.052251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.052339 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.052290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.052339 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.052331 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b919db65-c87d-4156-9452-d8629ac33ecf-config-out\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.052406 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.052378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.153606 ip-10-0-132-64 
kubenswrapper[2576]: I0424 23:56:36.153425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.153606 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.153491 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b919db65-c87d-4156-9452-d8629ac33ecf-config-out\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.153606 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.153570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.153852 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.153627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-config-volume\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.153852 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.153707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.153852 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.153739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-web-config\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.153852 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.153765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b919db65-c87d-4156-9452-d8629ac33ecf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.156277 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.154618 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b919db65-c87d-4156-9452-d8629ac33ecf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.156277 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.154939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.156277 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.155020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nm5t8\" (UniqueName: 
\"kubernetes.io/projected/b919db65-c87d-4156-9452-d8629ac33ecf-kube-api-access-nm5t8\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.156277 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.155065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b919db65-c87d-4156-9452-d8629ac33ecf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.156277 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.155090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b919db65-c87d-4156-9452-d8629ac33ecf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.156277 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.155119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b919db65-c87d-4156-9452-d8629ac33ecf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.156277 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.155187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.156277 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.155751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b919db65-c87d-4156-9452-d8629ac33ecf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.156277 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.156233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b919db65-c87d-4156-9452-d8629ac33ecf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.158400 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.157528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.158400 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.157626 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.158400 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.157795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b919db65-c87d-4156-9452-d8629ac33ecf-config-out\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.158400 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.158344 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-config-volume\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.160464 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.159268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.160464 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.159357 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b919db65-c87d-4156-9452-d8629ac33ecf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.160464 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.159941 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-web-config\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.160464 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.160375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.161026 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.161006 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.164379 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.164358 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm5t8\" (UniqueName: \"kubernetes.io/projected/b919db65-c87d-4156-9452-d8629ac33ecf-kube-api-access-nm5t8\") pod \"alertmanager-main-0\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.288078 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.288037 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:36.424590 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:36.424537 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:56:36.427281 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:56:36.427248 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb919db65_c87d_4156_9452_d8629ac33ecf.slice/crio-641d27c0f48a81dc145ce676cc87c9b11046be2ea3fdbe502437787ac7a8eb8b WatchSource:0}: Error finding container 641d27c0f48a81dc145ce676cc87c9b11046be2ea3fdbe502437787ac7a8eb8b: Status 404 returned error can't find the container with id 641d27c0f48a81dc145ce676cc87c9b11046be2ea3fdbe502437787ac7a8eb8b Apr 24 23:56:37.027613 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:37.027554 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2n5fc" 
event={"ID":"230b41dd-71de-46ef-9417-3bcfa4d0c7ef","Type":"ContainerStarted","Data":"59d052beef72ee23318662a21d6b24817c2216493b4b83a380418c4608973e1b"}
Apr 24 23:56:37.029222 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:37.029179 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerStarted","Data":"641d27c0f48a81dc145ce676cc87c9b11046be2ea3fdbe502437787ac7a8eb8b"}
Apr 24 23:56:37.031088 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:37.031059 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g" event={"ID":"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466","Type":"ContainerStarted","Data":"22a6cc5d00ce8666222b1d202eb7be4272c691e4cf622f5be9716ff3106c7722"}
Apr 24 23:56:37.031196 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:37.031096 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g" event={"ID":"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466","Type":"ContainerStarted","Data":"8258e31bb86ea5e164e33985dd203e3f0cd6f03ae8c89601bc729c80e99b5ce0"}
Apr 24 23:56:37.938440 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:37.938400 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-696f599485-86lrs"]
Apr 24 23:56:37.942495 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:37.942469 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:37.944852 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:37.944825 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 24 23:56:37.945021 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:37.944822 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 24 23:56:37.945021 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:37.944830 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 24 23:56:37.945021 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:37.944933 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-pv2jq\""
Apr 24 23:56:37.945157 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:37.945022 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-33gad72bnc5i1\""
Apr 24 23:56:37.945241 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:37.945225 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 24 23:56:37.945300 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:37.945250 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 24 23:56:37.954432 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:37.954409 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-696f599485-86lrs"]
Apr 24 23:56:38.035350 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.035317 2576 generic.go:358] "Generic (PLEG): container finished" podID="b919db65-c87d-4156-9452-d8629ac33ecf" containerID="926a3b5244eb19f8df3eca6d665e12c84c5126e3025cbc3d1101c5a9357a4d41" exitCode=0
Apr 24 23:56:38.035768 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.035361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerDied","Data":"926a3b5244eb19f8df3eca6d665e12c84c5126e3025cbc3d1101c5a9357a4d41"}
Apr 24 23:56:38.037367 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.037341 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g" event={"ID":"c2beaa31-74bb-4e8e-a7d9-b8b39dc02466","Type":"ContainerStarted","Data":"20c1ee14ed710e157184c1bc78ea9e5c2794c274492ac0685bfeaa3c624c4177"}
Apr 24 23:56:38.038599 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.038574 2576 generic.go:358] "Generic (PLEG): container finished" podID="230b41dd-71de-46ef-9417-3bcfa4d0c7ef" containerID="59d052beef72ee23318662a21d6b24817c2216493b4b83a380418c4608973e1b" exitCode=0
Apr 24 23:56:38.038767 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.038612 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2n5fc" event={"ID":"230b41dd-71de-46ef-9417-3bcfa4d0c7ef","Type":"ContainerDied","Data":"59d052beef72ee23318662a21d6b24817c2216493b4b83a380418c4608973e1b"}
Apr 24 23:56:38.072469 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.072436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.072560 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.072483 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.072598 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.072565 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.072598 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.072584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-grpc-tls\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.072683 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.072613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-tls\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.072683 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.072638 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb77e695-4f0e-4120-b4e4-479cb80be577-metrics-client-ca\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.072790 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.072677 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.072790 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.072748 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-677x8\" (UniqueName: \"kubernetes.io/projected/cb77e695-4f0e-4120-b4e4-479cb80be577-kube-api-access-677x8\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.081469 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.081427 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l75g" podStartSLOduration=2.845101928 podStartE2EDuration="4.081415527s" podCreationTimestamp="2026-04-24 23:56:34 +0000 UTC" firstStartedPulling="2026-04-24 23:56:36.134656223 +0000 UTC m=+157.295605845" lastFinishedPulling="2026-04-24 23:56:37.370969822 +0000 UTC m=+158.531919444" observedRunningTime="2026-04-24 23:56:38.080518542 +0000 UTC m=+159.241468182" watchObservedRunningTime="2026-04-24 23:56:38.081415527 +0000 UTC m=+159.242365145"
Apr 24 23:56:38.173816 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.173772 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.174017 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.173840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.174085 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.174056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.174141 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.174082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-grpc-tls\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.174195 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.174161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-tls\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.175879 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.174642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb77e695-4f0e-4120-b4e4-479cb80be577-metrics-client-ca\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.175879 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.174748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.175879 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.174876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-677x8\" (UniqueName: \"kubernetes.io/projected/cb77e695-4f0e-4120-b4e4-479cb80be577-kube-api-access-677x8\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.175879 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.175415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb77e695-4f0e-4120-b4e4-479cb80be577-metrics-client-ca\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.177032 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.177002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.177157 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.177068 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.178548 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.178517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-tls\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.178661 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.178589 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.178746 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.178725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-grpc-tls\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.179178 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.179161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cb77e695-4f0e-4120-b4e4-479cb80be577-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.184119 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.184097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-677x8\" (UniqueName: \"kubernetes.io/projected/cb77e695-4f0e-4120-b4e4-479cb80be577-kube-api-access-677x8\") pod \"thanos-querier-696f599485-86lrs\" (UID: \"cb77e695-4f0e-4120-b4e4-479cb80be577\") " pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.251567 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.251540 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-696f599485-86lrs"
Apr 24 23:56:38.387157 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:38.386978 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-696f599485-86lrs"]
Apr 24 23:56:38.389864 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:56:38.389838 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb77e695_4f0e_4120_b4e4_479cb80be577.slice/crio-d9dda518717c02a70aede20bef67457c7da2cabf95ebd765f88f1f1c910cec00 WatchSource:0}: Error finding container d9dda518717c02a70aede20bef67457c7da2cabf95ebd765f88f1f1c910cec00: Status 404 returned error can't find the container with id d9dda518717c02a70aede20bef67457c7da2cabf95ebd765f88f1f1c910cec00
Apr 24 23:56:39.043344 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:39.043300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-696f599485-86lrs" event={"ID":"cb77e695-4f0e-4120-b4e4-479cb80be577","Type":"ContainerStarted","Data":"d9dda518717c02a70aede20bef67457c7da2cabf95ebd765f88f1f1c910cec00"}
Apr 24 23:56:39.046229 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:39.046196 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2n5fc" event={"ID":"230b41dd-71de-46ef-9417-3bcfa4d0c7ef","Type":"ContainerStarted","Data":"cab38fa7837b0d318c46bf6dfdc51813f1145326cfba9a4956ac36e0e535c129"}
Apr 24 23:56:39.046354 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:39.046235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2n5fc" event={"ID":"230b41dd-71de-46ef-9417-3bcfa4d0c7ef","Type":"ContainerStarted","Data":"be235812186427bb7b6af613075697425ad0a78405985a4829dfd02ed20f5c32"}
Apr 24 23:56:39.066962 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:39.066908 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2n5fc" podStartSLOduration=4.238464795 podStartE2EDuration="5.066889413s" podCreationTimestamp="2026-04-24 23:56:34 +0000 UTC" firstStartedPulling="2026-04-24 23:56:35.904329572 +0000 UTC m=+157.065279202" lastFinishedPulling="2026-04-24 23:56:36.732754186 +0000 UTC m=+157.893703820" observedRunningTime="2026-04-24 23:56:39.065140022 +0000 UTC m=+160.226089665" watchObservedRunningTime="2026-04-24 23:56:39.066889413 +0000 UTC m=+160.227839057"
Apr 24 23:56:39.640290 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:39.640265 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6hzlt"]
Apr 24 23:56:39.644483 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:39.644466 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6hzlt"
Apr 24 23:56:39.646592 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:39.646567 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-b4848\""
Apr 24 23:56:39.646711 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:39.646570 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 24 23:56:39.654141 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:39.654121 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6hzlt"]
Apr 24 23:56:39.690758 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:39.690735 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/504b17af-7b61-4156-bbc0-1a95c7919e51-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6hzlt\" (UID: \"504b17af-7b61-4156-bbc0-1a95c7919e51\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6hzlt"
Apr 24 23:56:39.791940 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:39.791908 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/504b17af-7b61-4156-bbc0-1a95c7919e51-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6hzlt\" (UID: \"504b17af-7b61-4156-bbc0-1a95c7919e51\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6hzlt"
Apr 24 23:56:39.792094 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:56:39.792023 2576 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 24 23:56:39.792158 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:56:39.792095 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/504b17af-7b61-4156-bbc0-1a95c7919e51-monitoring-plugin-cert podName:504b17af-7b61-4156-bbc0-1a95c7919e51 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:40.29207465 +0000 UTC m=+161.453024273 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/504b17af-7b61-4156-bbc0-1a95c7919e51-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-6hzlt" (UID: "504b17af-7b61-4156-bbc0-1a95c7919e51") : secret "monitoring-plugin-cert" not found
Apr 24 23:56:40.052310 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.052264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerStarted","Data":"c9743eda73e7cb7a0811b84e16b14e55df9c8b452b72e28a0a7179ef15f825f1"}
Apr 24 23:56:40.052810 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.052319 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerStarted","Data":"55b3db262a48e89e3196fced6d5fe201e7f4a60ec51f62e3c88cfd2203b011ff"}
Apr 24 23:56:40.052810 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.052335 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerStarted","Data":"f6fb96325651da369f6ee25b3e2d003155e03b0c04f6d332f918dc9cd685a28f"}
Apr 24 23:56:40.052810 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.052348 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerStarted","Data":"de00943ffada5b50d990160dcfb18e669f0cc5cd45242f59bb04e6865b13b264"}
Apr 24 23:56:40.052810 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.052361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerStarted","Data":"db6760491a736da69e4e4e284a46841099b769af85f970092d669eeada6e850a"}
Apr 24 23:56:40.094247 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.093772 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb"
Apr 24 23:56:40.094247 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.093980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert\") pod \"ingress-canary-gb2jv\" (UID: \"14193e4c-7287-4686-892b-3006e6c02a97\") " pod="openshift-ingress-canary/ingress-canary-gb2jv"
Apr 24 23:56:40.097765 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.097525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18a83d5-7d20-4b99-9a28-d4fea36360b1-metrics-tls\") pod \"dns-default-j4hmb\" (UID: \"c18a83d5-7d20-4b99-9a28-d4fea36360b1\") " pod="openshift-dns/dns-default-j4hmb"
Apr 24 23:56:40.097933 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.097824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14193e4c-7287-4686-892b-3006e6c02a97-cert\") pod \"ingress-canary-gb2jv\" (UID: \"14193e4c-7287-4686-892b-3006e6c02a97\") " pod="openshift-ingress-canary/ingress-canary-gb2jv"
Apr 24 23:56:40.226098 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.226065 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ktrpk\""
Apr 24 23:56:40.226279 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.226065 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-h7snm\""
Apr 24 23:56:40.234382 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.234337 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gb2jv"
Apr 24 23:56:40.234493 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.234444 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j4hmb"
Apr 24 23:56:40.295973 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.295939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/504b17af-7b61-4156-bbc0-1a95c7919e51-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6hzlt\" (UID: \"504b17af-7b61-4156-bbc0-1a95c7919e51\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6hzlt"
Apr 24 23:56:40.299380 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.299341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/504b17af-7b61-4156-bbc0-1a95c7919e51-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6hzlt\" (UID: \"504b17af-7b61-4156-bbc0-1a95c7919e51\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6hzlt"
Apr 24 23:56:40.419956 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.419926 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gb2jv"]
Apr 24 23:56:40.421822 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:56:40.421797 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14193e4c_7287_4686_892b_3006e6c02a97.slice/crio-bfbce8dd3644e1213a364be077ffc07dad8869db579f099d2cb86d848f6faf35 WatchSource:0}: Error finding container bfbce8dd3644e1213a364be077ffc07dad8869db579f099d2cb86d848f6faf35: Status 404 returned error can't find the container with id bfbce8dd3644e1213a364be077ffc07dad8869db579f099d2cb86d848f6faf35
Apr 24 23:56:40.438687 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.438603 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j4hmb"]
Apr 24 23:56:40.443543 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:56:40.443499 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc18a83d5_7d20_4b99_9a28_d4fea36360b1.slice/crio-41317019d749ada8d26615c074b1c3293f0b2f61b7845c4ca159aa78e28dea4c WatchSource:0}: Error finding container 41317019d749ada8d26615c074b1c3293f0b2f61b7845c4ca159aa78e28dea4c: Status 404 returned error can't find the container with id 41317019d749ada8d26615c074b1c3293f0b2f61b7845c4ca159aa78e28dea4c
Apr 24 23:56:40.567923 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.567894 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6hzlt"
Apr 24 23:56:40.701180 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:40.701148 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6hzlt"]
Apr 24 23:56:40.730863 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:56:40.730832 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod504b17af_7b61_4156_bbc0_1a95c7919e51.slice/crio-2be9bc0cc5d713148d73e2255d4628e80e9381e30d2427f8369ddb2fdd60f4ae WatchSource:0}: Error finding container 2be9bc0cc5d713148d73e2255d4628e80e9381e30d2427f8369ddb2fdd60f4ae: Status 404 returned error can't find the container with id 2be9bc0cc5d713148d73e2255d4628e80e9381e30d2427f8369ddb2fdd60f4ae
Apr 24 23:56:41.060808 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.060750 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerStarted","Data":"f8e5472306032bcbc034d5a22a178ad1ddc1aadf102b294eb0ae3b4ee2528a9a"}
Apr 24 23:56:41.065006 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.064925 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-696f599485-86lrs" event={"ID":"cb77e695-4f0e-4120-b4e4-479cb80be577","Type":"ContainerStarted","Data":"f1f484102fea02e9fa3f0943f635b257102b17f25ee0f08baa5019f48477c0c6"}
Apr 24 23:56:41.065006 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.064961 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-696f599485-86lrs" event={"ID":"cb77e695-4f0e-4120-b4e4-479cb80be577","Type":"ContainerStarted","Data":"9996a054abdb7018321e4400efe6f364caa71fd386f55d2cdb8aaa3546610415"}
Apr 24 23:56:41.065006 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.064974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-696f599485-86lrs" event={"ID":"cb77e695-4f0e-4120-b4e4-479cb80be577","Type":"ContainerStarted","Data":"28de232371fd851ce1c121a1ba7f3e9ece205570782ab0918c9a894c99b02695"}
Apr 24 23:56:41.065006 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.064986 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-696f599485-86lrs" event={"ID":"cb77e695-4f0e-4120-b4e4-479cb80be577","Type":"ContainerStarted","Data":"f251628db369d0470419573b260dab10bc9096ce535dc664520267a25245d7fe"}
Apr 24 23:56:41.066326 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.066261 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gb2jv" event={"ID":"14193e4c-7287-4686-892b-3006e6c02a97","Type":"ContainerStarted","Data":"bfbce8dd3644e1213a364be077ffc07dad8869db579f099d2cb86d848f6faf35"}
Apr 24 23:56:41.067564 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.067540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j4hmb" event={"ID":"c18a83d5-7d20-4b99-9a28-d4fea36360b1","Type":"ContainerStarted","Data":"41317019d749ada8d26615c074b1c3293f0b2f61b7845c4ca159aa78e28dea4c"}
Apr 24 23:56:41.068533 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.068508 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6hzlt" event={"ID":"504b17af-7b61-4156-bbc0-1a95c7919e51","Type":"ContainerStarted","Data":"2be9bc0cc5d713148d73e2255d4628e80e9381e30d2427f8369ddb2fdd60f4ae"}
Apr 24 23:56:41.532443 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.532392 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" podUID="5c5fbc58-8768-4fe2-80b6-18689310ec18" containerName="registry" containerID="cri-o://878bf82dbcee335b8d439e575d91f80ea3bcb5ab92375f7b82ea9e6ad9c2606a" gracePeriod=30
Apr 24 23:56:41.561752 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.560641 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.215058666 podStartE2EDuration="6.560615636s" podCreationTimestamp="2026-04-24 23:56:35 +0000 UTC" firstStartedPulling="2026-04-24 23:56:36.429468623 +0000 UTC m=+157.590418243" lastFinishedPulling="2026-04-24 23:56:40.775025594 +0000 UTC m=+161.935975213" observedRunningTime="2026-04-24 23:56:41.088927124 +0000 UTC m=+162.249876766" watchObservedRunningTime="2026-04-24 23:56:41.560615636 +0000 UTC m=+162.721565278"
Apr 24 23:56:41.562266 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.562241 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f86db465c-nl8nl"]
Apr 24 23:56:41.565853 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.565820 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24 23:56:41.574343 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.574307 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 23:56:41.580238 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.580214 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f86db465c-nl8nl"]
Apr 24 23:56:41.610148 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.610117 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-oauth-config\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24 23:56:41.610269 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.610171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-config\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24 23:56:41.610269 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.610210 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7gsx\" (UniqueName: \"kubernetes.io/projected/12c5df2c-e0bf-49b6-8272-0817e3902d6d-kube-api-access-d7gsx\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24 23:56:41.610269 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.610237 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-serving-cert\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24 23:56:41.610269 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.610264 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-oauth-serving-cert\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24 23:56:41.610488 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.610338 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-trusted-ca-bundle\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24 23:56:41.610488 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.610367 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-service-ca\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24 23:56:41.711464 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.711426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-oauth-config\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24 23:56:41.711631 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.711486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-config\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24 23:56:41.711631 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.711617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7gsx\" (UniqueName: \"kubernetes.io/projected/12c5df2c-e0bf-49b6-8272-0817e3902d6d-kube-api-access-d7gsx\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24 23:56:41.711777 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.711651 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-serving-cert\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24 23:56:41.711777 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.711683 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-oauth-serving-cert\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24 23:56:41.711877 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.711789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-trusted-ca-bundle\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24
23:56:41.711877 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.711823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-service-ca\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl" Apr 24 23:56:41.712236 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.712205 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-config\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl" Apr 24 23:56:41.712522 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.712498 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-service-ca\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl" Apr 24 23:56:41.712752 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.712685 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-oauth-serving-cert\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl" Apr 24 23:56:41.713028 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.713002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-trusted-ca-bundle\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl" Apr 
24 23:56:41.714745 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.714725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-oauth-config\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl" Apr 24 23:56:41.714842 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.714819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-serving-cert\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl" Apr 24 23:56:41.719798 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.719773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7gsx\" (UniqueName: \"kubernetes.io/projected/12c5df2c-e0bf-49b6-8272-0817e3902d6d-kube-api-access-d7gsx\") pod \"console-6f86db465c-nl8nl\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " pod="openshift-console/console-6f86db465c-nl8nl" Apr 24 23:56:41.894501 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:41.894423 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f86db465c-nl8nl" Apr 24 23:56:42.075718 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.075651 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-696f599485-86lrs" event={"ID":"cb77e695-4f0e-4120-b4e4-479cb80be577","Type":"ContainerStarted","Data":"ac251e3611c58d1f66f5c77589206fad54c98ec0fb27ed9d777b43539bcae1cd"} Apr 24 23:56:42.075718 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.075721 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-696f599485-86lrs" event={"ID":"cb77e695-4f0e-4120-b4e4-479cb80be577","Type":"ContainerStarted","Data":"d5c4df4989a817ea2a23045fbe97b8a2ec1ff3456b2a8bb0f299829fc93f19a7"} Apr 24 23:56:42.076246 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.075855 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-696f599485-86lrs" Apr 24 23:56:42.077232 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.077204 2576 generic.go:358] "Generic (PLEG): container finished" podID="5c5fbc58-8768-4fe2-80b6-18689310ec18" containerID="878bf82dbcee335b8d439e575d91f80ea3bcb5ab92375f7b82ea9e6ad9c2606a" exitCode=0 Apr 24 23:56:42.077352 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.077242 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" event={"ID":"5c5fbc58-8768-4fe2-80b6-18689310ec18","Type":"ContainerDied","Data":"878bf82dbcee335b8d439e575d91f80ea3bcb5ab92375f7b82ea9e6ad9c2606a"} Apr 24 23:56:42.100947 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.100882 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-696f599485-86lrs" podStartSLOduration=2.593495758 podStartE2EDuration="5.100866283s" podCreationTimestamp="2026-04-24 23:56:37 +0000 UTC" firstStartedPulling="2026-04-24 23:56:38.391813364 +0000 
UTC m=+159.552762984" lastFinishedPulling="2026-04-24 23:56:40.899183875 +0000 UTC m=+162.060133509" observedRunningTime="2026-04-24 23:56:42.09909591 +0000 UTC m=+163.260045551" watchObservedRunningTime="2026-04-24 23:56:42.100866283 +0000 UTC m=+163.261815957" Apr 24 23:56:42.783544 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.783520 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:56:42.925336 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.925288 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5c5fbc58-8768-4fe2-80b6-18689310ec18-ca-trust-extracted\") pod \"5c5fbc58-8768-4fe2-80b6-18689310ec18\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " Apr 24 23:56:42.925589 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.925353 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fx2g\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-kube-api-access-9fx2g\") pod \"5c5fbc58-8768-4fe2-80b6-18689310ec18\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " Apr 24 23:56:42.925589 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.925400 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls\") pod \"5c5fbc58-8768-4fe2-80b6-18689310ec18\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " Apr 24 23:56:42.925589 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.925443 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5c5fbc58-8768-4fe2-80b6-18689310ec18-installation-pull-secrets\") pod \"5c5fbc58-8768-4fe2-80b6-18689310ec18\" (UID: 
\"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " Apr 24 23:56:42.925589 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.925468 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-bound-sa-token\") pod \"5c5fbc58-8768-4fe2-80b6-18689310ec18\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " Apr 24 23:56:42.925589 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.925522 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5c5fbc58-8768-4fe2-80b6-18689310ec18-image-registry-private-configuration\") pod \"5c5fbc58-8768-4fe2-80b6-18689310ec18\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " Apr 24 23:56:42.925917 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.925602 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-certificates\") pod \"5c5fbc58-8768-4fe2-80b6-18689310ec18\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " Apr 24 23:56:42.925917 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.925627 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c5fbc58-8768-4fe2-80b6-18689310ec18-trusted-ca\") pod \"5c5fbc58-8768-4fe2-80b6-18689310ec18\" (UID: \"5c5fbc58-8768-4fe2-80b6-18689310ec18\") " Apr 24 23:56:42.926832 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.926313 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c5fbc58-8768-4fe2-80b6-18689310ec18-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5c5fbc58-8768-4fe2-80b6-18689310ec18" (UID: "5c5fbc58-8768-4fe2-80b6-18689310ec18"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:56:42.926832 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.926786 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5c5fbc58-8768-4fe2-80b6-18689310ec18" (UID: "5c5fbc58-8768-4fe2-80b6-18689310ec18"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:56:42.930572 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.930491 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5c5fbc58-8768-4fe2-80b6-18689310ec18" (UID: "5c5fbc58-8768-4fe2-80b6-18689310ec18"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:56:42.930572 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.930548 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-kube-api-access-9fx2g" (OuterVolumeSpecName: "kube-api-access-9fx2g") pod "5c5fbc58-8768-4fe2-80b6-18689310ec18" (UID: "5c5fbc58-8768-4fe2-80b6-18689310ec18"). InnerVolumeSpecName "kube-api-access-9fx2g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:56:42.933495 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.933447 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5fbc58-8768-4fe2-80b6-18689310ec18-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5c5fbc58-8768-4fe2-80b6-18689310ec18" (UID: "5c5fbc58-8768-4fe2-80b6-18689310ec18"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:56:42.933802 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.933766 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5c5fbc58-8768-4fe2-80b6-18689310ec18" (UID: "5c5fbc58-8768-4fe2-80b6-18689310ec18"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:56:42.933969 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.933937 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5fbc58-8768-4fe2-80b6-18689310ec18-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5c5fbc58-8768-4fe2-80b6-18689310ec18" (UID: "5c5fbc58-8768-4fe2-80b6-18689310ec18"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:56:42.938962 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.938932 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c5fbc58-8768-4fe2-80b6-18689310ec18-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5c5fbc58-8768-4fe2-80b6-18689310ec18" (UID: "5c5fbc58-8768-4fe2-80b6-18689310ec18"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:56:42.960256 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:42.960235 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f86db465c-nl8nl"] Apr 24 23:56:42.962523 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:56:42.962495 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12c5df2c_e0bf_49b6_8272_0817e3902d6d.slice/crio-3c14ced5cd71f83b3ac04c5ee2152a3ad42e0f94d0e0fa6a7dca590881c62401 WatchSource:0}: Error finding container 3c14ced5cd71f83b3ac04c5ee2152a3ad42e0f94d0e0fa6a7dca590881c62401: Status 404 returned error can't find the container with id 3c14ced5cd71f83b3ac04c5ee2152a3ad42e0f94d0e0fa6a7dca590881c62401 Apr 24 23:56:43.027620 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.027481 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-certificates\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 24 23:56:43.027620 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.027509 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c5fbc58-8768-4fe2-80b6-18689310ec18-trusted-ca\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 24 23:56:43.027620 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.027523 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5c5fbc58-8768-4fe2-80b6-18689310ec18-ca-trust-extracted\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 24 23:56:43.027620 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.027535 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9fx2g\" (UniqueName: 
\"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-kube-api-access-9fx2g\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 24 23:56:43.027620 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.027549 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-registry-tls\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 24 23:56:43.027620 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.027599 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5c5fbc58-8768-4fe2-80b6-18689310ec18-installation-pull-secrets\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 24 23:56:43.027620 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.027612 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c5fbc58-8768-4fe2-80b6-18689310ec18-bound-sa-token\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 24 23:56:43.028012 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.027626 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5c5fbc58-8768-4fe2-80b6-18689310ec18-image-registry-private-configuration\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 24 23:56:43.082732 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.082649 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" event={"ID":"5c5fbc58-8768-4fe2-80b6-18689310ec18","Type":"ContainerDied","Data":"649ec64f1f3b09be039c5d7df03a7c22d82abc4246e666aef796e382d92e42cb"} Apr 24 23:56:43.082732 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.082738 2576 scope.go:117] "RemoveContainer" containerID="878bf82dbcee335b8d439e575d91f80ea3bcb5ab92375f7b82ea9e6ad9c2606a" Apr 24 
23:56:43.083238 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.082917 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65f5954f47-n9ftv" Apr 24 23:56:43.085493 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.085455 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gb2jv" event={"ID":"14193e4c-7287-4686-892b-3006e6c02a97","Type":"ContainerStarted","Data":"671e1faf866b9fbebf97d3de92b8604373bc1720b19187c8a4762fd56f4c95ce"} Apr 24 23:56:43.087327 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.087268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j4hmb" event={"ID":"c18a83d5-7d20-4b99-9a28-d4fea36360b1","Type":"ContainerStarted","Data":"1e3f639e77d653357ecff878c47d4a72f4c8a9c1bf470be6956709c215809172"} Apr 24 23:56:43.088680 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.088652 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f86db465c-nl8nl" event={"ID":"12c5df2c-e0bf-49b6-8272-0817e3902d6d","Type":"ContainerStarted","Data":"b2df585cac55e5edeec7c8d8b50b79ffa6494ab83c4edc6643ad07fd043578fc"} Apr 24 23:56:43.088803 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.088706 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f86db465c-nl8nl" event={"ID":"12c5df2c-e0bf-49b6-8272-0817e3902d6d","Type":"ContainerStarted","Data":"3c14ced5cd71f83b3ac04c5ee2152a3ad42e0f94d0e0fa6a7dca590881c62401"} Apr 24 23:56:43.090538 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.090473 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6hzlt" event={"ID":"504b17af-7b61-4156-bbc0-1a95c7919e51","Type":"ContainerStarted","Data":"997ab9046bd4de547c1436a3b5472d0a7f5321a859565834cfde463a002230af"} Apr 24 23:56:43.091082 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.091044 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6hzlt" Apr 24 23:56:43.098807 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.098769 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6hzlt" Apr 24 23:56:43.103238 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.103190 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gb2jv" podStartSLOduration=128.722712658 podStartE2EDuration="2m11.103176161s" podCreationTimestamp="2026-04-24 23:54:32 +0000 UTC" firstStartedPulling="2026-04-24 23:56:40.424188672 +0000 UTC m=+161.585138294" lastFinishedPulling="2026-04-24 23:56:42.804652165 +0000 UTC m=+163.965601797" observedRunningTime="2026-04-24 23:56:43.103158835 +0000 UTC m=+164.264108477" watchObservedRunningTime="2026-04-24 23:56:43.103176161 +0000 UTC m=+164.264125813" Apr 24 23:56:43.119781 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.119735 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6hzlt" podStartSLOduration=2.046374851 podStartE2EDuration="4.119719117s" podCreationTimestamp="2026-04-24 23:56:39 +0000 UTC" firstStartedPulling="2026-04-24 23:56:40.732529687 +0000 UTC m=+161.893479309" lastFinishedPulling="2026-04-24 23:56:42.805873941 +0000 UTC m=+163.966823575" observedRunningTime="2026-04-24 23:56:43.118224095 +0000 UTC m=+164.279173736" watchObservedRunningTime="2026-04-24 23:56:43.119719117 +0000 UTC m=+164.280668755" Apr 24 23:56:43.143924 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.143895 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-65f5954f47-n9ftv"] Apr 24 23:56:43.150797 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.150749 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-image-registry/image-registry-65f5954f47-n9ftv"] Apr 24 23:56:43.164639 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.164590 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f86db465c-nl8nl" podStartSLOduration=2.1645755 podStartE2EDuration="2.1645755s" podCreationTimestamp="2026-04-24 23:56:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:56:43.163138326 +0000 UTC m=+164.324087965" watchObservedRunningTime="2026-04-24 23:56:43.1645755 +0000 UTC m=+164.325525140" Apr 24 23:56:43.400634 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:43.400555 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c5fbc58-8768-4fe2-80b6-18689310ec18" path="/var/lib/kubelet/pods/5c5fbc58-8768-4fe2-80b6-18689310ec18/volumes" Apr 24 23:56:44.096463 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:44.096365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j4hmb" event={"ID":"c18a83d5-7d20-4b99-9a28-d4fea36360b1","Type":"ContainerStarted","Data":"636d7f62b6a4e3901b13938b23231876cfb0f6a75cc41c48f0d4dcfc3c7b74af"} Apr 24 23:56:44.096904 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:44.096883 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-j4hmb" Apr 24 23:56:44.113391 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:44.113341 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-j4hmb" podStartSLOduration=129.754972681 podStartE2EDuration="2m12.11330394s" podCreationTimestamp="2026-04-24 23:54:32 +0000 UTC" firstStartedPulling="2026-04-24 23:56:40.44634479 +0000 UTC m=+161.607294411" lastFinishedPulling="2026-04-24 23:56:42.804676044 +0000 UTC m=+163.965625670" observedRunningTime="2026-04-24 23:56:44.111959753 +0000 UTC m=+165.272909394" 
watchObservedRunningTime="2026-04-24 23:56:44.11330394 +0000 UTC m=+165.274253581" Apr 24 23:56:48.096349 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:48.096311 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-696f599485-86lrs" Apr 24 23:56:51.895155 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:51.895120 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f86db465c-nl8nl" Apr 24 23:56:51.895155 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:51.895165 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f86db465c-nl8nl" Apr 24 23:56:51.899757 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:51.899735 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f86db465c-nl8nl" Apr 24 23:56:52.126816 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:52.126789 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f86db465c-nl8nl" Apr 24 23:56:52.181117 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:52.181040 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cb66f7d6-zc5fm"] Apr 24 23:56:54.102716 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:56:54.102666 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-j4hmb" Apr 24 23:57:06.169246 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:06.169211 2576 generic.go:358] "Generic (PLEG): container finished" podID="bc0a1f9d-aade-4d80-a5b8-fbc8542431a7" containerID="22776efa65079d6a9bb87ef6ab06271083e1c9ea172b87fb38975f512bda1e1c" exitCode=0 Apr 24 23:57:06.169620 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:06.169288 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" 
event={"ID":"bc0a1f9d-aade-4d80-a5b8-fbc8542431a7","Type":"ContainerDied","Data":"22776efa65079d6a9bb87ef6ab06271083e1c9ea172b87fb38975f512bda1e1c"}
Apr 24 23:57:06.169665 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:06.169629 2576 scope.go:117] "RemoveContainer" containerID="22776efa65079d6a9bb87ef6ab06271083e1c9ea172b87fb38975f512bda1e1c"
Apr 24 23:57:07.174535 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:07.174499 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpjq8" event={"ID":"bc0a1f9d-aade-4d80-a5b8-fbc8542431a7","Type":"ContainerStarted","Data":"f1a26dfba2bc40fad268c3e78e541c709443a9f26035013298562ccb68e29ef0"}
Apr 24 23:57:16.204214 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:16.204107 2576 generic.go:358] "Generic (PLEG): container finished" podID="d5c69f72-f063-42af-a243-45f740a1ea73" containerID="d9c570de223238ef64867607586f8d8def2d02a3c00fba774b626416c373f0b3" exitCode=0
Apr 24 23:57:16.204644 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:16.204211 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sfnrq" event={"ID":"d5c69f72-f063-42af-a243-45f740a1ea73","Type":"ContainerDied","Data":"d9c570de223238ef64867607586f8d8def2d02a3c00fba774b626416c373f0b3"}
Apr 24 23:57:16.204745 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:16.204650 2576 scope.go:117] "RemoveContainer" containerID="d9c570de223238ef64867607586f8d8def2d02a3c00fba774b626416c373f0b3"
Apr 24 23:57:17.207095 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.207026 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7cb66f7d6-zc5fm" podUID="07f709e0-ce17-4501-9585-6b8fb8a4b824" containerName="console" containerID="cri-o://b5ae2a5cbf8e59a4a02b21742a85061c994c3b50de5547e10504c74780229648" gracePeriod=15
Apr 24 23:57:17.208788 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.208761 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sfnrq" event={"ID":"d5c69f72-f063-42af-a243-45f740a1ea73","Type":"ContainerStarted","Data":"6748f2e3db86b4c35e7b0eb09046669f2d1cce05bc9c0071ed952c5272941100"}
Apr 24 23:57:17.451744 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.451719 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cb66f7d6-zc5fm_07f709e0-ce17-4501-9585-6b8fb8a4b824/console/0.log"
Apr 24 23:57:17.451852 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.451779 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:57:17.514063 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.514027 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-serving-cert\") pod \"07f709e0-ce17-4501-9585-6b8fb8a4b824\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") "
Apr 24 23:57:17.514230 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.514078 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-service-ca\") pod \"07f709e0-ce17-4501-9585-6b8fb8a4b824\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") "
Apr 24 23:57:17.514230 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.514104 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-config\") pod \"07f709e0-ce17-4501-9585-6b8fb8a4b824\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") "
Apr 24 23:57:17.514230 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.514144 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-oauth-config\") pod \"07f709e0-ce17-4501-9585-6b8fb8a4b824\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") "
Apr 24 23:57:17.514230 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.514161 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqf4w\" (UniqueName: \"kubernetes.io/projected/07f709e0-ce17-4501-9585-6b8fb8a4b824-kube-api-access-cqf4w\") pod \"07f709e0-ce17-4501-9585-6b8fb8a4b824\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") "
Apr 24 23:57:17.514230 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.514197 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-oauth-serving-cert\") pod \"07f709e0-ce17-4501-9585-6b8fb8a4b824\" (UID: \"07f709e0-ce17-4501-9585-6b8fb8a4b824\") "
Apr 24 23:57:17.514637 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.514603 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-config" (OuterVolumeSpecName: "console-config") pod "07f709e0-ce17-4501-9585-6b8fb8a4b824" (UID: "07f709e0-ce17-4501-9585-6b8fb8a4b824"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:17.514780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.514609 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-service-ca" (OuterVolumeSpecName: "service-ca") pod "07f709e0-ce17-4501-9585-6b8fb8a4b824" (UID: "07f709e0-ce17-4501-9585-6b8fb8a4b824"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:17.514780 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.514748 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "07f709e0-ce17-4501-9585-6b8fb8a4b824" (UID: "07f709e0-ce17-4501-9585-6b8fb8a4b824"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:17.516540 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.516515 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "07f709e0-ce17-4501-9585-6b8fb8a4b824" (UID: "07f709e0-ce17-4501-9585-6b8fb8a4b824"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:17.516848 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.516825 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "07f709e0-ce17-4501-9585-6b8fb8a4b824" (UID: "07f709e0-ce17-4501-9585-6b8fb8a4b824"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:17.516923 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.516863 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f709e0-ce17-4501-9585-6b8fb8a4b824-kube-api-access-cqf4w" (OuterVolumeSpecName: "kube-api-access-cqf4w") pod "07f709e0-ce17-4501-9585-6b8fb8a4b824" (UID: "07f709e0-ce17-4501-9585-6b8fb8a4b824"). InnerVolumeSpecName "kube-api-access-cqf4w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:57:17.614977 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.614951 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-oauth-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:17.614977 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.614974 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cqf4w\" (UniqueName: \"kubernetes.io/projected/07f709e0-ce17-4501-9585-6b8fb8a4b824-kube-api-access-cqf4w\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:17.615133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.614985 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-oauth-serving-cert\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:17.615133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.614994 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-serving-cert\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:17.615133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.615004 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-service-ca\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:17.615133 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:17.615013 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/07f709e0-ce17-4501-9585-6b8fb8a4b824-console-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:18.212553 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:18.212527 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cb66f7d6-zc5fm_07f709e0-ce17-4501-9585-6b8fb8a4b824/console/0.log"
Apr 24 23:57:18.212954 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:18.212569 2576 generic.go:358] "Generic (PLEG): container finished" podID="07f709e0-ce17-4501-9585-6b8fb8a4b824" containerID="b5ae2a5cbf8e59a4a02b21742a85061c994c3b50de5547e10504c74780229648" exitCode=2
Apr 24 23:57:18.212954 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:18.212644 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cb66f7d6-zc5fm"
Apr 24 23:57:18.212954 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:18.212656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cb66f7d6-zc5fm" event={"ID":"07f709e0-ce17-4501-9585-6b8fb8a4b824","Type":"ContainerDied","Data":"b5ae2a5cbf8e59a4a02b21742a85061c994c3b50de5547e10504c74780229648"}
Apr 24 23:57:18.212954 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:18.212711 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cb66f7d6-zc5fm" event={"ID":"07f709e0-ce17-4501-9585-6b8fb8a4b824","Type":"ContainerDied","Data":"0146d80bbeb84852ac664d370a3a3f9ca3ec02c79fbe37b9c948dfbcd43ab09f"}
Apr 24 23:57:18.212954 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:18.212727 2576 scope.go:117] "RemoveContainer" containerID="b5ae2a5cbf8e59a4a02b21742a85061c994c3b50de5547e10504c74780229648"
Apr 24 23:57:18.221485 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:18.221464 2576 scope.go:117] "RemoveContainer" containerID="b5ae2a5cbf8e59a4a02b21742a85061c994c3b50de5547e10504c74780229648"
Apr 24 23:57:18.222038 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:57:18.222010 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5ae2a5cbf8e59a4a02b21742a85061c994c3b50de5547e10504c74780229648\": container with ID starting with b5ae2a5cbf8e59a4a02b21742a85061c994c3b50de5547e10504c74780229648 not found: ID does not exist" containerID="b5ae2a5cbf8e59a4a02b21742a85061c994c3b50de5547e10504c74780229648"
Apr 24 23:57:18.222130 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:18.222045 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ae2a5cbf8e59a4a02b21742a85061c994c3b50de5547e10504c74780229648"} err="failed to get container status \"b5ae2a5cbf8e59a4a02b21742a85061c994c3b50de5547e10504c74780229648\": rpc error: code = NotFound desc = could not find container \"b5ae2a5cbf8e59a4a02b21742a85061c994c3b50de5547e10504c74780229648\": container with ID starting with b5ae2a5cbf8e59a4a02b21742a85061c994c3b50de5547e10504c74780229648 not found: ID does not exist"
Apr 24 23:57:18.233272 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:18.233239 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cb66f7d6-zc5fm"]
Apr 24 23:57:18.237067 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:18.237043 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7cb66f7d6-zc5fm"]
Apr 24 23:57:19.399040 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:19.399011 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f709e0-ce17-4501-9585-6b8fb8a4b824" path="/var/lib/kubelet/pods/07f709e0-ce17-4501-9585-6b8fb8a4b824/volumes"
Apr 24 23:57:26.239825 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:26.239790 2576 generic.go:358] "Generic (PLEG): container finished" podID="34e5790d-4147-4de2-8280-8a4d156daee6" containerID="6620fd58674c33a5f8180d845e73a9ad771612aba2d483909b5200494c6997b5" exitCode=0
Apr 24 23:57:26.240245 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:26.239866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" event={"ID":"34e5790d-4147-4de2-8280-8a4d156daee6","Type":"ContainerDied","Data":"6620fd58674c33a5f8180d845e73a9ad771612aba2d483909b5200494c6997b5"}
Apr 24 23:57:26.240245 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:26.240178 2576 scope.go:117] "RemoveContainer" containerID="6620fd58674c33a5f8180d845e73a9ad771612aba2d483909b5200494c6997b5"
Apr 24 23:57:27.244546 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:27.244511 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gtxb2" event={"ID":"34e5790d-4147-4de2-8280-8a4d156daee6","Type":"ContainerStarted","Data":"7ef8ff7250b8749ef58dc1a8e2add6bc9c8af40e4f5b32d71af1b2154998bdd5"}
Apr 24 23:57:55.207980 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.207940 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 23:57:55.208385 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.208325 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="alertmanager" containerID="cri-o://db6760491a736da69e4e4e284a46841099b769af85f970092d669eeada6e850a" gracePeriod=120
Apr 24 23:57:55.208441 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.208392 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="kube-rbac-proxy-web" containerID="cri-o://f6fb96325651da369f6ee25b3e2d003155e03b0c04f6d332f918dc9cd685a28f" gracePeriod=120
Apr 24 23:57:55.208441 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.208395 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="kube-rbac-proxy-metric" containerID="cri-o://c9743eda73e7cb7a0811b84e16b14e55df9c8b452b72e28a0a7179ef15f825f1" gracePeriod=120
Apr 24 23:57:55.208529 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.208457 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="kube-rbac-proxy" containerID="cri-o://55b3db262a48e89e3196fced6d5fe201e7f4a60ec51f62e3c88cfd2203b011ff" gracePeriod=120
Apr 24 23:57:55.208529 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.208458 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="config-reloader" containerID="cri-o://de00943ffada5b50d990160dcfb18e669f0cc5cd45242f59bb04e6865b13b264" gracePeriod=120
Apr 24 23:57:55.208529 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.208440 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="prom-label-proxy" containerID="cri-o://f8e5472306032bcbc034d5a22a178ad1ddc1aadf102b294eb0ae3b4ee2528a9a" gracePeriod=120
Apr 24 23:57:55.327842 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.327814 2576 generic.go:358] "Generic (PLEG): container finished" podID="b919db65-c87d-4156-9452-d8629ac33ecf" containerID="f8e5472306032bcbc034d5a22a178ad1ddc1aadf102b294eb0ae3b4ee2528a9a" exitCode=0
Apr 24 23:57:55.327842 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.327838 2576 generic.go:358] "Generic (PLEG): container finished" podID="b919db65-c87d-4156-9452-d8629ac33ecf" containerID="55b3db262a48e89e3196fced6d5fe201e7f4a60ec51f62e3c88cfd2203b011ff" exitCode=0
Apr 24 23:57:55.327842 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.327844 2576 generic.go:358] "Generic (PLEG): container finished" podID="b919db65-c87d-4156-9452-d8629ac33ecf" containerID="de00943ffada5b50d990160dcfb18e669f0cc5cd45242f59bb04e6865b13b264" exitCode=0
Apr 24 23:57:55.327993 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.327850 2576 generic.go:358] "Generic (PLEG): container finished" podID="b919db65-c87d-4156-9452-d8629ac33ecf" containerID="db6760491a736da69e4e4e284a46841099b769af85f970092d669eeada6e850a" exitCode=0
Apr 24 23:57:55.327993 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.327872 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerDied","Data":"f8e5472306032bcbc034d5a22a178ad1ddc1aadf102b294eb0ae3b4ee2528a9a"}
Apr 24 23:57:55.327993 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.327894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerDied","Data":"55b3db262a48e89e3196fced6d5fe201e7f4a60ec51f62e3c88cfd2203b011ff"}
Apr 24 23:57:55.327993 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.327903 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerDied","Data":"de00943ffada5b50d990160dcfb18e669f0cc5cd45242f59bb04e6865b13b264"}
Apr 24 23:57:55.327993 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:55.327912 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerDied","Data":"db6760491a736da69e4e4e284a46841099b769af85f970092d669eeada6e850a"}
Apr 24 23:57:56.335031 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.335001 2576 generic.go:358] "Generic (PLEG): container finished" podID="b919db65-c87d-4156-9452-d8629ac33ecf" containerID="c9743eda73e7cb7a0811b84e16b14e55df9c8b452b72e28a0a7179ef15f825f1" exitCode=0
Apr 24 23:57:56.335031 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.335029 2576 generic.go:358] "Generic (PLEG): container finished" podID="b919db65-c87d-4156-9452-d8629ac33ecf" containerID="f6fb96325651da369f6ee25b3e2d003155e03b0c04f6d332f918dc9cd685a28f" exitCode=0
Apr 24 23:57:56.335352 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.335061 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerDied","Data":"c9743eda73e7cb7a0811b84e16b14e55df9c8b452b72e28a0a7179ef15f825f1"}
Apr 24 23:57:56.335352 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.335086 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerDied","Data":"f6fb96325651da369f6ee25b3e2d003155e03b0c04f6d332f918dc9cd685a28f"}
Apr 24 23:57:56.450786 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.450762 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:57:56.533001 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.532905 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-config-volume\") pod \"b919db65-c87d-4156-9452-d8629ac33ecf\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") "
Apr 24 23:57:56.533001 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.532969 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"b919db65-c87d-4156-9452-d8629ac33ecf\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") "
Apr 24 23:57:56.533225 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.533008 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy-web\") pod \"b919db65-c87d-4156-9452-d8629ac33ecf\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") "
Apr 24 23:57:56.533225 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.533030 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-web-config\") pod \"b919db65-c87d-4156-9452-d8629ac33ecf\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") "
Apr 24 23:57:56.533225 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.533052 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-main-tls\") pod \"b919db65-c87d-4156-9452-d8629ac33ecf\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") "
Apr 24 23:57:56.533225 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.533091 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b919db65-c87d-4156-9452-d8629ac33ecf-metrics-client-ca\") pod \"b919db65-c87d-4156-9452-d8629ac33ecf\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") "
Apr 24 23:57:56.533225 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.533116 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-cluster-tls-config\") pod \"b919db65-c87d-4156-9452-d8629ac33ecf\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") "
Apr 24 23:57:56.533225 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.533151 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm5t8\" (UniqueName: \"kubernetes.io/projected/b919db65-c87d-4156-9452-d8629ac33ecf-kube-api-access-nm5t8\") pod \"b919db65-c87d-4156-9452-d8629ac33ecf\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") "
Apr 24 23:57:56.533225 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.533192 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b919db65-c87d-4156-9452-d8629ac33ecf-alertmanager-trusted-ca-bundle\") pod \"b919db65-c87d-4156-9452-d8629ac33ecf\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") "
Apr 24 23:57:56.533564 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.533244 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy\") pod \"b919db65-c87d-4156-9452-d8629ac33ecf\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") "
Apr 24 23:57:56.533564 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.533271 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b919db65-c87d-4156-9452-d8629ac33ecf-tls-assets\") pod \"b919db65-c87d-4156-9452-d8629ac33ecf\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") "
Apr 24 23:57:56.533564 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.533315 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b919db65-c87d-4156-9452-d8629ac33ecf-alertmanager-main-db\") pod \"b919db65-c87d-4156-9452-d8629ac33ecf\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") "
Apr 24 23:57:56.533564 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.533347 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b919db65-c87d-4156-9452-d8629ac33ecf-config-out\") pod \"b919db65-c87d-4156-9452-d8629ac33ecf\" (UID: \"b919db65-c87d-4156-9452-d8629ac33ecf\") "
Apr 24 23:57:56.533564 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.533529 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b919db65-c87d-4156-9452-d8629ac33ecf-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "b919db65-c87d-4156-9452-d8629ac33ecf" (UID: "b919db65-c87d-4156-9452-d8629ac33ecf"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:56.533851 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.533650 2576 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b919db65-c87d-4156-9452-d8629ac33ecf-metrics-client-ca\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:56.535144 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.534608 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b919db65-c87d-4156-9452-d8629ac33ecf-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "b919db65-c87d-4156-9452-d8629ac33ecf" (UID: "b919db65-c87d-4156-9452-d8629ac33ecf"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:57:56.535144 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.535082 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b919db65-c87d-4156-9452-d8629ac33ecf-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "b919db65-c87d-4156-9452-d8629ac33ecf" (UID: "b919db65-c87d-4156-9452-d8629ac33ecf"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:56.536107 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.536056 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "b919db65-c87d-4156-9452-d8629ac33ecf" (UID: "b919db65-c87d-4156-9452-d8629ac33ecf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:56.536558 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.536521 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "b919db65-c87d-4156-9452-d8629ac33ecf" (UID: "b919db65-c87d-4156-9452-d8629ac33ecf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:56.536856 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.536825 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b919db65-c87d-4156-9452-d8629ac33ecf-config-out" (OuterVolumeSpecName: "config-out") pod "b919db65-c87d-4156-9452-d8629ac33ecf" (UID: "b919db65-c87d-4156-9452-d8629ac33ecf"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:57:56.536986 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.536852 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "b919db65-c87d-4156-9452-d8629ac33ecf" (UID: "b919db65-c87d-4156-9452-d8629ac33ecf"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:56.536986 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.536871 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-config-volume" (OuterVolumeSpecName: "config-volume") pod "b919db65-c87d-4156-9452-d8629ac33ecf" (UID: "b919db65-c87d-4156-9452-d8629ac33ecf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:56.538072 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.538045 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b919db65-c87d-4156-9452-d8629ac33ecf-kube-api-access-nm5t8" (OuterVolumeSpecName: "kube-api-access-nm5t8") pod "b919db65-c87d-4156-9452-d8629ac33ecf" (UID: "b919db65-c87d-4156-9452-d8629ac33ecf"). InnerVolumeSpecName "kube-api-access-nm5t8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:57:56.538415 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.538393 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "b919db65-c87d-4156-9452-d8629ac33ecf" (UID: "b919db65-c87d-4156-9452-d8629ac33ecf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:56.538589 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.538563 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b919db65-c87d-4156-9452-d8629ac33ecf-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b919db65-c87d-4156-9452-d8629ac33ecf" (UID: "b919db65-c87d-4156-9452-d8629ac33ecf"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:57:56.541285 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.541241 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "b919db65-c87d-4156-9452-d8629ac33ecf" (UID: "b919db65-c87d-4156-9452-d8629ac33ecf"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:56.547830 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.547808 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-web-config" (OuterVolumeSpecName: "web-config") pod "b919db65-c87d-4156-9452-d8629ac33ecf" (UID: "b919db65-c87d-4156-9452-d8629ac33ecf"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:56.634385 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.634334 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b919db65-c87d-4156-9452-d8629ac33ecf-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:56.634385 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.634380 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:56.634385 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.634392 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b919db65-c87d-4156-9452-d8629ac33ecf-tls-assets\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:56.634385 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.634402 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b919db65-c87d-4156-9452-d8629ac33ecf-alertmanager-main-db\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:56.634631 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.634411 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b919db65-c87d-4156-9452-d8629ac33ecf-config-out\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:56.634631 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.634420 2576 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-config-volume\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:56.634631 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.634428 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:56.634631 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.634438 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:56.634631 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.634448 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-web-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:56.634631 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.634457 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-secret-alertmanager-main-tls\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:56.634631 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.634465 2576 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b919db65-c87d-4156-9452-d8629ac33ecf-cluster-tls-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:56.634631 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:56.634474 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nm5t8\" (UniqueName: \"kubernetes.io/projected/b919db65-c87d-4156-9452-d8629ac33ecf-kube-api-access-nm5t8\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:57:57.340789 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.340752 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b919db65-c87d-4156-9452-d8629ac33ecf","Type":"ContainerDied","Data":"641d27c0f48a81dc145ce676cc87c9b11046be2ea3fdbe502437787ac7a8eb8b"}
Apr 24 23:57:57.340789 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.340780 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:57:57.341231 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.340808 2576 scope.go:117] "RemoveContainer" containerID="f8e5472306032bcbc034d5a22a178ad1ddc1aadf102b294eb0ae3b4ee2528a9a"
Apr 24 23:57:57.349209 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.349190 2576 scope.go:117] "RemoveContainer" containerID="c9743eda73e7cb7a0811b84e16b14e55df9c8b452b72e28a0a7179ef15f825f1"
Apr 24 23:57:57.356263 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.356249 2576 scope.go:117] "RemoveContainer" containerID="55b3db262a48e89e3196fced6d5fe201e7f4a60ec51f62e3c88cfd2203b011ff"
Apr 24 23:57:57.363440 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.363421 2576 scope.go:117] "RemoveContainer" containerID="f6fb96325651da369f6ee25b3e2d003155e03b0c04f6d332f918dc9cd685a28f"
Apr 24 23:57:57.364259 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.364237 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 23:57:57.370176 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.370160 2576 scope.go:117] "RemoveContainer" containerID="de00943ffada5b50d990160dcfb18e669f0cc5cd45242f59bb04e6865b13b264"
Apr 24 23:57:57.373023 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.373002 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 23:57:57.377251 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.377238 2576 scope.go:117] "RemoveContainer" containerID="db6760491a736da69e4e4e284a46841099b769af85f970092d669eeada6e850a"
Apr 24 23:57:57.384416 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.384397 2576 scope.go:117] "RemoveContainer" containerID="926a3b5244eb19f8df3eca6d665e12c84c5126e3025cbc3d1101c5a9357a4d41"
Apr 24 23:57:57.400964 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.400940 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" path="/var/lib/kubelet/pods/b919db65-c87d-4156-9452-d8629ac33ecf/volumes"
Apr 24 23:57:57.401477 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401459 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 23:57:57.401803 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401787 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="kube-rbac-proxy-web"
Apr 24 23:57:57.401885 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401806 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="kube-rbac-proxy-web"
Apr 24 23:57:57.401885 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401819 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="kube-rbac-proxy"
Apr 24 23:57:57.401885 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401826 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="kube-rbac-proxy"
Apr 24 23:57:57.401885 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401835 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="kube-rbac-proxy-metric"
Apr 24 23:57:57.401885 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401842 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="kube-rbac-proxy-metric"
Apr 24 23:57:57.401885 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401863 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="prom-label-proxy"
Apr 24 23:57:57.401885 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401872 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="prom-label-proxy"
Apr 24 23:57:57.401885 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401884 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c5fbc58-8768-4fe2-80b6-18689310ec18" containerName="registry"
Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401892 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c5fbc58-8768-4fe2-80b6-18689310ec18" containerName="registry"
Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401901 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07f709e0-ce17-4501-9585-6b8fb8a4b824" containerName="console"
Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401910 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f709e0-ce17-4501-9585-6b8fb8a4b824" containerName="console"
Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401920 2576 cpu_manager.go:401]
"RemoveStaleState: containerMap: removing container" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="config-reloader" Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401928 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="config-reloader" Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401941 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="init-config-reloader" Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401952 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="init-config-reloader" Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401967 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="alertmanager" Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.401977 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="alertmanager" Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.402053 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="kube-rbac-proxy-metric" Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.402068 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="07f709e0-ce17-4501-9585-6b8fb8a4b824" containerName="console" Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.402078 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="config-reloader" Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.402093 2576 
memory_manager.go:356] "RemoveStaleState removing state" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="kube-rbac-proxy-web" Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.402103 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="alertmanager" Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.402113 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="prom-label-proxy" Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.402124 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b919db65-c87d-4156-9452-d8629ac33ecf" containerName="kube-rbac-proxy" Apr 24 23:57:57.402253 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.402133 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c5fbc58-8768-4fe2-80b6-18689310ec18" containerName="registry" Apr 24 23:57:57.407476 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.407458 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.409882 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.409864 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 23:57:57.410184 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.409977 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 23:57:57.410184 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.409995 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 23:57:57.410184 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.410067 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 23:57:57.410184 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.410070 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 23:57:57.410184 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.410118 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 23:57:57.410184 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.410123 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 23:57:57.410529 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.410361 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 23:57:57.410529 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.410441 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-9r4jj\"" Apr 24 23:57:57.415353 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.415326 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 23:57:57.417621 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.417600 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:57:57.542046 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.541954 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/787e6d5d-d8c4-411c-8182-e1c27fa743f8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.542046 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.541997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-web-config\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.542046 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.542029 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/787e6d5d-d8c4-411c-8182-e1c27fa743f8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.542309 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.542085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.542309 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.542123 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhxq9\" (UniqueName: \"kubernetes.io/projected/787e6d5d-d8c4-411c-8182-e1c27fa743f8-kube-api-access-lhxq9\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.542309 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.542152 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.542309 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.542173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/787e6d5d-d8c4-411c-8182-e1c27fa743f8-config-out\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.542309 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.542217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-config-volume\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.542309 ip-10-0-132-64 
kubenswrapper[2576]: I0424 23:57:57.542249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.542309 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.542266 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/787e6d5d-d8c4-411c-8182-e1c27fa743f8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.542626 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.542326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.542626 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.542395 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.542626 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.542499 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/787e6d5d-d8c4-411c-8182-e1c27fa743f8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.643265 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.643223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.643265 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.643265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/787e6d5d-d8c4-411c-8182-e1c27fa743f8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.643504 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.643287 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/787e6d5d-d8c4-411c-8182-e1c27fa743f8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.643504 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.643318 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-web-config\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.643504 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.643350 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/787e6d5d-d8c4-411c-8182-e1c27fa743f8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.643504 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.643377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.643744 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.643537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhxq9\" (UniqueName: \"kubernetes.io/projected/787e6d5d-d8c4-411c-8182-e1c27fa743f8-kube-api-access-lhxq9\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.643744 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.643601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.643744 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.643638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/787e6d5d-d8c4-411c-8182-e1c27fa743f8-config-out\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.643744 ip-10-0-132-64 
kubenswrapper[2576]: I0424 23:57:57.643669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-config-volume\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.643744 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.643733 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.644033 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.643760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/787e6d5d-d8c4-411c-8182-e1c27fa743f8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.644033 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.643791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.644033 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.643817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/787e6d5d-d8c4-411c-8182-e1c27fa743f8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.644178 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.644146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/787e6d5d-d8c4-411c-8182-e1c27fa743f8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.645361 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.645327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/787e6d5d-d8c4-411c-8182-e1c27fa743f8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.646947 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.646882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.646947 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.646882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-config-volume\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.647365 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.647218 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-web-config\") pod \"alertmanager-main-0\" (UID: 
\"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.647365 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.647241 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/787e6d5d-d8c4-411c-8182-e1c27fa743f8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.647365 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.647359 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.647606 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.647582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/787e6d5d-d8c4-411c-8182-e1c27fa743f8-config-out\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.647797 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.647778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.647906 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.647890 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-cluster-tls-config\") pod 
\"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.648377 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.648360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/787e6d5d-d8c4-411c-8182-e1c27fa743f8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.651970 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.651954 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhxq9\" (UniqueName: \"kubernetes.io/projected/787e6d5d-d8c4-411c-8182-e1c27fa743f8-kube-api-access-lhxq9\") pod \"alertmanager-main-0\" (UID: \"787e6d5d-d8c4-411c-8182-e1c27fa743f8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.718533 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.718511 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:57:57.847834 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:57.847801 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:57:57.851064 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:57:57.851030 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod787e6d5d_d8c4_411c_8182_e1c27fa743f8.slice/crio-95865c90b39c5a55edca8f5d19f2d16d1066828ded2c5aa65a087263977745c8 WatchSource:0}: Error finding container 95865c90b39c5a55edca8f5d19f2d16d1066828ded2c5aa65a087263977745c8: Status 404 returned error can't find the container with id 95865c90b39c5a55edca8f5d19f2d16d1066828ded2c5aa65a087263977745c8 Apr 24 23:57:58.346258 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:58.346220 2576 generic.go:358] "Generic (PLEG): container finished" podID="787e6d5d-d8c4-411c-8182-e1c27fa743f8" containerID="582ff0ba51a3a5649418b8e9e5422dcf3e3a65d73a35aa94ebfd8a0dc0139eeb" exitCode=0 Apr 24 23:57:58.346614 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:58.346302 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"787e6d5d-d8c4-411c-8182-e1c27fa743f8","Type":"ContainerDied","Data":"582ff0ba51a3a5649418b8e9e5422dcf3e3a65d73a35aa94ebfd8a0dc0139eeb"} Apr 24 23:57:58.346614 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:58.346336 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"787e6d5d-d8c4-411c-8182-e1c27fa743f8","Type":"ContainerStarted","Data":"95865c90b39c5a55edca8f5d19f2d16d1066828ded2c5aa65a087263977745c8"} Apr 24 23:57:59.238649 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.238617 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz"] Apr 24 23:57:59.242257 ip-10-0-132-64 
kubenswrapper[2576]: I0424 23:57:59.242237 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.244828 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.244803 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 24 23:57:59.244955 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.244847 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 24 23:57:59.244955 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.244875 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-bvwx5\"" Apr 24 23:57:59.244955 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.244883 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 24 23:57:59.244955 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.244903 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 24 23:57:59.245163 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.244973 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 24 23:57:59.250471 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.250454 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 24 23:57:59.257846 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.257828 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz"] Apr 24 23:57:59.352369 ip-10-0-132-64 kubenswrapper[2576]: I0424 
23:57:59.352341 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"787e6d5d-d8c4-411c-8182-e1c27fa743f8","Type":"ContainerStarted","Data":"8eeb4e8c343c960f91d1248b1d9c049ac3e54beb50f886421303a25cb4a41548"} Apr 24 23:57:59.352655 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.352375 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"787e6d5d-d8c4-411c-8182-e1c27fa743f8","Type":"ContainerStarted","Data":"0df6caa2a1f7631a3cd02ca60ae1360b9fdc8c0864bd0050a06fdc2e96f97727"} Apr 24 23:57:59.352655 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.352386 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"787e6d5d-d8c4-411c-8182-e1c27fa743f8","Type":"ContainerStarted","Data":"701f103bb5f605df0b0374c3805c7da8c57a9b9b8369ad163aa4e48a711af57e"} Apr 24 23:57:59.352655 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.352394 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"787e6d5d-d8c4-411c-8182-e1c27fa743f8","Type":"ContainerStarted","Data":"95295e3da229825857c7217e4db14b47f63dbce910efc67ef90076264f0fb56d"} Apr 24 23:57:59.352655 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.352402 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"787e6d5d-d8c4-411c-8182-e1c27fa743f8","Type":"ContainerStarted","Data":"d5da3c5689a3a5f7e9d9bfaf9bd35b662b070f4fab6dcd4babb6599678e52fa5"} Apr 24 23:57:59.352655 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.352410 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"787e6d5d-d8c4-411c-8182-e1c27fa743f8","Type":"ContainerStarted","Data":"d064b743f29d2e1d05aa6b3fd9e6cb9350538ce1657a40ac392ce9e9dd4ecf9a"} Apr 24 23:57:59.357177 ip-10-0-132-64 
kubenswrapper[2576]: I0424 23:57:59.357154 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f60b52c-16eb-4774-8e73-fb65b6922ee9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.357248 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.357188 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3f60b52c-16eb-4774-8e73-fb65b6922ee9-federate-client-tls\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.357248 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.357209 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3f60b52c-16eb-4774-8e73-fb65b6922ee9-secret-telemeter-client\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.357353 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.357326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brvjr\" (UniqueName: \"kubernetes.io/projected/3f60b52c-16eb-4774-8e73-fb65b6922ee9-kube-api-access-brvjr\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.357405 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.357386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f60b52c-16eb-4774-8e73-fb65b6922ee9-metrics-client-ca\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.357468 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.357451 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f60b52c-16eb-4774-8e73-fb65b6922ee9-serving-certs-ca-bundle\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.357517 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.357480 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3f60b52c-16eb-4774-8e73-fb65b6922ee9-telemeter-client-tls\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.357517 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.357508 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3f60b52c-16eb-4774-8e73-fb65b6922ee9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.386902 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.386849 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.386829675 podStartE2EDuration="2.386829675s" 
podCreationTimestamp="2026-04-24 23:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:57:59.380402433 +0000 UTC m=+240.541352075" watchObservedRunningTime="2026-04-24 23:57:59.386829675 +0000 UTC m=+240.547779317" Apr 24 23:57:59.457973 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.457933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brvjr\" (UniqueName: \"kubernetes.io/projected/3f60b52c-16eb-4774-8e73-fb65b6922ee9-kube-api-access-brvjr\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.458149 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.458031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f60b52c-16eb-4774-8e73-fb65b6922ee9-metrics-client-ca\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.458149 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.458110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f60b52c-16eb-4774-8e73-fb65b6922ee9-serving-certs-ca-bundle\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.458149 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.458139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3f60b52c-16eb-4774-8e73-fb65b6922ee9-telemeter-client-tls\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: 
\"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.458306 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.458166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3f60b52c-16eb-4774-8e73-fb65b6922ee9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.458306 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.458249 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f60b52c-16eb-4774-8e73-fb65b6922ee9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.458478 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.458437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3f60b52c-16eb-4774-8e73-fb65b6922ee9-federate-client-tls\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.458546 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.458528 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3f60b52c-16eb-4774-8e73-fb65b6922ee9-secret-telemeter-client\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.458991 ip-10-0-132-64 kubenswrapper[2576]: I0424 
23:57:59.458962 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f60b52c-16eb-4774-8e73-fb65b6922ee9-serving-certs-ca-bundle\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.459113 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.459060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f60b52c-16eb-4774-8e73-fb65b6922ee9-metrics-client-ca\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.459308 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.459281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f60b52c-16eb-4774-8e73-fb65b6922ee9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.460687 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.460658 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3f60b52c-16eb-4774-8e73-fb65b6922ee9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.461654 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.461615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: 
\"kubernetes.io/secret/3f60b52c-16eb-4774-8e73-fb65b6922ee9-telemeter-client-tls\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.461836 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.461819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3f60b52c-16eb-4774-8e73-fb65b6922ee9-secret-telemeter-client\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.461884 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.461831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3f60b52c-16eb-4774-8e73-fb65b6922ee9-federate-client-tls\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.466396 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.466372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brvjr\" (UniqueName: \"kubernetes.io/projected/3f60b52c-16eb-4774-8e73-fb65b6922ee9-kube-api-access-brvjr\") pod \"telemeter-client-667c8bcf6d-xjsqz\" (UID: \"3f60b52c-16eb-4774-8e73-fb65b6922ee9\") " pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.553653 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.553625 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" Apr 24 23:57:59.682115 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:57:59.682091 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz"] Apr 24 23:57:59.684150 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:57:59.684120 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f60b52c_16eb_4774_8e73_fb65b6922ee9.slice/crio-0201bedbb14f634ad4b9596b3f6e4256fb4406b4422fc579d52b190397685514 WatchSource:0}: Error finding container 0201bedbb14f634ad4b9596b3f6e4256fb4406b4422fc579d52b190397685514: Status 404 returned error can't find the container with id 0201bedbb14f634ad4b9596b3f6e4256fb4406b4422fc579d52b190397685514 Apr 24 23:58:00.356947 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:00.356899 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" event={"ID":"3f60b52c-16eb-4774-8e73-fb65b6922ee9","Type":"ContainerStarted","Data":"0201bedbb14f634ad4b9596b3f6e4256fb4406b4422fc579d52b190397685514"} Apr 24 23:58:01.361922 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:01.361890 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" event={"ID":"3f60b52c-16eb-4774-8e73-fb65b6922ee9","Type":"ContainerStarted","Data":"281391da88e3446410d906a54f017d771693c6fe6efdd6ffefc1f0d52e1adc76"} Apr 24 23:58:01.362273 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:01.361932 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" event={"ID":"3f60b52c-16eb-4774-8e73-fb65b6922ee9","Type":"ContainerStarted","Data":"1e1f834b59b405b8b7f8c2bbc78c8fff102efa7a1e75fe0fe21e94b7907017d8"} Apr 24 23:58:02.366624 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:02.366587 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" event={"ID":"3f60b52c-16eb-4774-8e73-fb65b6922ee9","Type":"ContainerStarted","Data":"01041e68db98cc9924ec27daf96e4d75ab2b38bf41bcb36967474d708a175d60"} Apr 24 23:58:02.408609 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:02.408557 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-667c8bcf6d-xjsqz" podStartSLOduration=1.8785080349999999 podStartE2EDuration="3.408543991s" podCreationTimestamp="2026-04-24 23:57:59 +0000 UTC" firstStartedPulling="2026-04-24 23:57:59.686018507 +0000 UTC m=+240.846968127" lastFinishedPulling="2026-04-24 23:58:01.216054464 +0000 UTC m=+242.377004083" observedRunningTime="2026-04-24 23:58:02.405372305 +0000 UTC m=+243.566321958" watchObservedRunningTime="2026-04-24 23:58:02.408543991 +0000 UTC m=+243.569493656" Apr 24 23:58:03.138194 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.138153 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59cd4bcc77-5vstg"] Apr 24 23:58:03.141711 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.141667 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.151293 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.151241 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59cd4bcc77-5vstg"] Apr 24 23:58:03.293105 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.293073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-566t8\" (UniqueName: \"kubernetes.io/projected/36d6a53c-29fb-419a-b3d8-25015e067b42-kube-api-access-566t8\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.293105 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.293108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36d6a53c-29fb-419a-b3d8-25015e067b42-console-serving-cert\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.293290 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.293125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-console-config\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.293290 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.293223 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-service-ca\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 
23:58:03.293290 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.293254 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-trusted-ca-bundle\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.293290 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.293288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36d6a53c-29fb-419a-b3d8-25015e067b42-console-oauth-config\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.293479 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.293345 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-oauth-serving-cert\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.393757 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.393665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36d6a53c-29fb-419a-b3d8-25015e067b42-console-serving-cert\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.393757 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.393726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-console-config\") pod 
\"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.394154 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.393760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-service-ca\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.394154 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.393776 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-trusted-ca-bundle\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.394154 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.393830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36d6a53c-29fb-419a-b3d8-25015e067b42-console-oauth-config\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.394154 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.393872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-oauth-serving-cert\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.394154 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.393963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-566t8\" (UniqueName: 
\"kubernetes.io/projected/36d6a53c-29fb-419a-b3d8-25015e067b42-kube-api-access-566t8\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.395102 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.395070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-service-ca\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.395207 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.395074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-oauth-serving-cert\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.395399 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.395370 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-trusted-ca-bundle\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.395469 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.395400 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-console-config\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.397298 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.397269 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36d6a53c-29fb-419a-b3d8-25015e067b42-console-oauth-config\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.397470 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.397446 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36d6a53c-29fb-419a-b3d8-25015e067b42-console-serving-cert\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.420119 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.420092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-566t8\" (UniqueName: \"kubernetes.io/projected/36d6a53c-29fb-419a-b3d8-25015e067b42-kube-api-access-566t8\") pod \"console-59cd4bcc77-5vstg\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") " pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.453169 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.453132 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:03.569825 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:03.569791 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59cd4bcc77-5vstg"] Apr 24 23:58:03.572778 ip-10-0-132-64 kubenswrapper[2576]: W0424 23:58:03.572749 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d6a53c_29fb_419a_b3d8_25015e067b42.slice/crio-4dc9f15b320ec13bee7fd1ff7ea4492eb223870e5a289a081b4d0551c12e9ff5 WatchSource:0}: Error finding container 4dc9f15b320ec13bee7fd1ff7ea4492eb223870e5a289a081b4d0551c12e9ff5: Status 404 returned error can't find the container with id 4dc9f15b320ec13bee7fd1ff7ea4492eb223870e5a289a081b4d0551c12e9ff5 Apr 24 23:58:04.373644 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:04.373605 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59cd4bcc77-5vstg" event={"ID":"36d6a53c-29fb-419a-b3d8-25015e067b42","Type":"ContainerStarted","Data":"4c475422bc5122a2dd683b0d647ac60ef4676be9966dca3232db13ed2dbfbc71"} Apr 24 23:58:04.373644 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:04.373648 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59cd4bcc77-5vstg" event={"ID":"36d6a53c-29fb-419a-b3d8-25015e067b42","Type":"ContainerStarted","Data":"4dc9f15b320ec13bee7fd1ff7ea4492eb223870e5a289a081b4d0551c12e9ff5"} Apr 24 23:58:04.391444 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:04.391396 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59cd4bcc77-5vstg" podStartSLOduration=1.39138185 podStartE2EDuration="1.39138185s" podCreationTimestamp="2026-04-24 23:58:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:58:04.389561571 +0000 UTC m=+245.550511214" 
watchObservedRunningTime="2026-04-24 23:58:04.39138185 +0000 UTC m=+245.552331490" Apr 24 23:58:13.453921 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:13.453832 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:13.453921 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:13.453875 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:13.458467 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:13.458440 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:14.407055 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:14.407028 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59cd4bcc77-5vstg" Apr 24 23:58:14.451497 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:14.451455 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f86db465c-nl8nl"] Apr 24 23:58:39.472201 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.472160 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6f86db465c-nl8nl" podUID="12c5df2c-e0bf-49b6-8272-0817e3902d6d" containerName="console" containerID="cri-o://b2df585cac55e5edeec7c8d8b50b79ffa6494ab83c4edc6643ad07fd043578fc" gracePeriod=15 Apr 24 23:58:39.722257 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.722203 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f86db465c-nl8nl_12c5df2c-e0bf-49b6-8272-0817e3902d6d/console/0.log" Apr 24 23:58:39.722346 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.722267 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f86db465c-nl8nl" Apr 24 23:58:39.795660 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.795629 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-oauth-config\") pod \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " Apr 24 23:58:39.795844 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.795685 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-config\") pod \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " Apr 24 23:58:39.795912 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.795843 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-serving-cert\") pod \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " Apr 24 23:58:39.795912 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.795888 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-trusted-ca-bundle\") pod \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " Apr 24 23:58:39.796025 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.796005 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7gsx\" (UniqueName: \"kubernetes.io/projected/12c5df2c-e0bf-49b6-8272-0817e3902d6d-kube-api-access-d7gsx\") pod \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") " Apr 24 23:58:39.796083 
ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.796036 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-oauth-serving-cert\") pod \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") "
Apr 24 23:58:39.796138 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.796120 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-service-ca\") pod \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\" (UID: \"12c5df2c-e0bf-49b6-8272-0817e3902d6d\") "
Apr 24 23:58:39.796194 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.796130 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-config" (OuterVolumeSpecName: "console-config") pod "12c5df2c-e0bf-49b6-8272-0817e3902d6d" (UID: "12c5df2c-e0bf-49b6-8272-0817e3902d6d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:58:39.796351 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.796263 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "12c5df2c-e0bf-49b6-8272-0817e3902d6d" (UID: "12c5df2c-e0bf-49b6-8272-0817e3902d6d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:58:39.796421 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.796392 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:58:39.796486 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.796458 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "12c5df2c-e0bf-49b6-8272-0817e3902d6d" (UID: "12c5df2c-e0bf-49b6-8272-0817e3902d6d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:58:39.796534 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.796493 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-service-ca" (OuterVolumeSpecName: "service-ca") pod "12c5df2c-e0bf-49b6-8272-0817e3902d6d" (UID: "12c5df2c-e0bf-49b6-8272-0817e3902d6d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:58:39.798165 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.798131 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12c5df2c-e0bf-49b6-8272-0817e3902d6d-kube-api-access-d7gsx" (OuterVolumeSpecName: "kube-api-access-d7gsx") pod "12c5df2c-e0bf-49b6-8272-0817e3902d6d" (UID: "12c5df2c-e0bf-49b6-8272-0817e3902d6d"). InnerVolumeSpecName "kube-api-access-d7gsx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:58:39.798254 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.798234 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "12c5df2c-e0bf-49b6-8272-0817e3902d6d" (UID: "12c5df2c-e0bf-49b6-8272-0817e3902d6d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:58:39.798254 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.798244 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "12c5df2c-e0bf-49b6-8272-0817e3902d6d" (UID: "12c5df2c-e0bf-49b6-8272-0817e3902d6d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:58:39.896996 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.896948 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-serving-cert\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:58:39.896996 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.896995 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-trusted-ca-bundle\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:58:39.896996 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.897005 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d7gsx\" (UniqueName: \"kubernetes.io/projected/12c5df2c-e0bf-49b6-8272-0817e3902d6d-kube-api-access-d7gsx\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:58:39.896996 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.897015 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-oauth-serving-cert\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:58:39.897258 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.897025 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12c5df2c-e0bf-49b6-8272-0817e3902d6d-service-ca\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:58:39.897258 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:39.897034 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12c5df2c-e0bf-49b6-8272-0817e3902d6d-console-oauth-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:58:40.480979 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:40.480944 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f86db465c-nl8nl_12c5df2c-e0bf-49b6-8272-0817e3902d6d/console/0.log"
Apr 24 23:58:40.481404 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:40.480992 2576 generic.go:358] "Generic (PLEG): container finished" podID="12c5df2c-e0bf-49b6-8272-0817e3902d6d" containerID="b2df585cac55e5edeec7c8d8b50b79ffa6494ab83c4edc6643ad07fd043578fc" exitCode=2
Apr 24 23:58:40.481404 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:40.481060 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f86db465c-nl8nl"
Apr 24 23:58:40.481404 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:40.481057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f86db465c-nl8nl" event={"ID":"12c5df2c-e0bf-49b6-8272-0817e3902d6d","Type":"ContainerDied","Data":"b2df585cac55e5edeec7c8d8b50b79ffa6494ab83c4edc6643ad07fd043578fc"}
Apr 24 23:58:40.481404 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:40.481175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f86db465c-nl8nl" event={"ID":"12c5df2c-e0bf-49b6-8272-0817e3902d6d","Type":"ContainerDied","Data":"3c14ced5cd71f83b3ac04c5ee2152a3ad42e0f94d0e0fa6a7dca590881c62401"}
Apr 24 23:58:40.481404 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:40.481198 2576 scope.go:117] "RemoveContainer" containerID="b2df585cac55e5edeec7c8d8b50b79ffa6494ab83c4edc6643ad07fd043578fc"
Apr 24 23:58:40.489763 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:40.489745 2576 scope.go:117] "RemoveContainer" containerID="b2df585cac55e5edeec7c8d8b50b79ffa6494ab83c4edc6643ad07fd043578fc"
Apr 24 23:58:40.490038 ip-10-0-132-64 kubenswrapper[2576]: E0424 23:58:40.490019 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2df585cac55e5edeec7c8d8b50b79ffa6494ab83c4edc6643ad07fd043578fc\": container with ID starting with b2df585cac55e5edeec7c8d8b50b79ffa6494ab83c4edc6643ad07fd043578fc not found: ID does not exist" containerID="b2df585cac55e5edeec7c8d8b50b79ffa6494ab83c4edc6643ad07fd043578fc"
Apr 24 23:58:40.490106 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:40.490050 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2df585cac55e5edeec7c8d8b50b79ffa6494ab83c4edc6643ad07fd043578fc"} err="failed to get container status \"b2df585cac55e5edeec7c8d8b50b79ffa6494ab83c4edc6643ad07fd043578fc\": rpc error: code = NotFound desc = could not find container \"b2df585cac55e5edeec7c8d8b50b79ffa6494ab83c4edc6643ad07fd043578fc\": container with ID starting with b2df585cac55e5edeec7c8d8b50b79ffa6494ab83c4edc6643ad07fd043578fc not found: ID does not exist"
Apr 24 23:58:40.516144 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:40.516099 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f86db465c-nl8nl"]
Apr 24 23:58:40.519908 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:40.519881 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6f86db465c-nl8nl"]
Apr 24 23:58:41.400115 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:41.400078 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12c5df2c-e0bf-49b6-8272-0817e3902d6d" path="/var/lib/kubelet/pods/12c5df2c-e0bf-49b6-8272-0817e3902d6d/volumes"
Apr 24 23:58:59.286343 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:59.286314 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/1.log"
Apr 24 23:58:59.286878 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:59.286853 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/1.log"
Apr 24 23:58:59.291393 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:59.291373 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log"
Apr 24 23:58:59.292109 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:59.292088 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log"
Apr 24 23:58:59.295670 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:58:59.295653 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 23:59:21.454519 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:21.454485 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59cd4bcc77-5vstg"]
Apr 24 23:59:46.474456 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.474414 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59cd4bcc77-5vstg" podUID="36d6a53c-29fb-419a-b3d8-25015e067b42" containerName="console" containerID="cri-o://4c475422bc5122a2dd683b0d647ac60ef4676be9966dca3232db13ed2dbfbc71" gracePeriod=15
Apr 24 23:59:46.675158 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.675130 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59cd4bcc77-5vstg_36d6a53c-29fb-419a-b3d8-25015e067b42/console/0.log"
Apr 24 23:59:46.675293 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.675184 2576 generic.go:358] "Generic (PLEG): container finished" podID="36d6a53c-29fb-419a-b3d8-25015e067b42" containerID="4c475422bc5122a2dd683b0d647ac60ef4676be9966dca3232db13ed2dbfbc71" exitCode=2
Apr 24 23:59:46.675293 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.675226 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59cd4bcc77-5vstg" event={"ID":"36d6a53c-29fb-419a-b3d8-25015e067b42","Type":"ContainerDied","Data":"4c475422bc5122a2dd683b0d647ac60ef4676be9966dca3232db13ed2dbfbc71"}
Apr 24 23:59:46.712410 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.712389 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59cd4bcc77-5vstg_36d6a53c-29fb-419a-b3d8-25015e067b42/console/0.log"
Apr 24 23:59:46.712519 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.712446 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59cd4bcc77-5vstg"
Apr 24 23:59:46.848983 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.848943 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36d6a53c-29fb-419a-b3d8-25015e067b42-console-serving-cert\") pod \"36d6a53c-29fb-419a-b3d8-25015e067b42\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") "
Apr 24 23:59:46.849180 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.848994 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-service-ca\") pod \"36d6a53c-29fb-419a-b3d8-25015e067b42\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") "
Apr 24 23:59:46.849180 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.849081 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-oauth-serving-cert\") pod \"36d6a53c-29fb-419a-b3d8-25015e067b42\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") "
Apr 24 23:59:46.849180 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.849114 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-566t8\" (UniqueName: \"kubernetes.io/projected/36d6a53c-29fb-419a-b3d8-25015e067b42-kube-api-access-566t8\") pod \"36d6a53c-29fb-419a-b3d8-25015e067b42\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") "
Apr 24 23:59:46.849180 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.849149 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-console-config\") pod \"36d6a53c-29fb-419a-b3d8-25015e067b42\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") "
Apr 24 23:59:46.849180 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.849172 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-trusted-ca-bundle\") pod \"36d6a53c-29fb-419a-b3d8-25015e067b42\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") "
Apr 24 23:59:46.849423 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.849204 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36d6a53c-29fb-419a-b3d8-25015e067b42-console-oauth-config\") pod \"36d6a53c-29fb-419a-b3d8-25015e067b42\" (UID: \"36d6a53c-29fb-419a-b3d8-25015e067b42\") "
Apr 24 23:59:46.849517 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.849471 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-service-ca" (OuterVolumeSpecName: "service-ca") pod "36d6a53c-29fb-419a-b3d8-25015e067b42" (UID: "36d6a53c-29fb-419a-b3d8-25015e067b42"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:59:46.849585 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.849543 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-console-config" (OuterVolumeSpecName: "console-config") pod "36d6a53c-29fb-419a-b3d8-25015e067b42" (UID: "36d6a53c-29fb-419a-b3d8-25015e067b42"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:59:46.849585 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.849567 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "36d6a53c-29fb-419a-b3d8-25015e067b42" (UID: "36d6a53c-29fb-419a-b3d8-25015e067b42"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:59:46.849718 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.849600 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "36d6a53c-29fb-419a-b3d8-25015e067b42" (UID: "36d6a53c-29fb-419a-b3d8-25015e067b42"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:59:46.851575 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.851554 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d6a53c-29fb-419a-b3d8-25015e067b42-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "36d6a53c-29fb-419a-b3d8-25015e067b42" (UID: "36d6a53c-29fb-419a-b3d8-25015e067b42"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:59:46.851803 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.851769 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d6a53c-29fb-419a-b3d8-25015e067b42-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "36d6a53c-29fb-419a-b3d8-25015e067b42" (UID: "36d6a53c-29fb-419a-b3d8-25015e067b42"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:59:46.851803 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.851786 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d6a53c-29fb-419a-b3d8-25015e067b42-kube-api-access-566t8" (OuterVolumeSpecName: "kube-api-access-566t8") pod "36d6a53c-29fb-419a-b3d8-25015e067b42" (UID: "36d6a53c-29fb-419a-b3d8-25015e067b42"). InnerVolumeSpecName "kube-api-access-566t8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:59:46.950468 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.950438 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-oauth-serving-cert\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:59:46.950468 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.950466 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-566t8\" (UniqueName: \"kubernetes.io/projected/36d6a53c-29fb-419a-b3d8-25015e067b42-kube-api-access-566t8\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:59:46.950653 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.950478 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-console-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:59:46.950653 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.950493 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-trusted-ca-bundle\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:59:46.950653 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.950507 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36d6a53c-29fb-419a-b3d8-25015e067b42-console-oauth-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:59:46.950653 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.950516 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36d6a53c-29fb-419a-b3d8-25015e067b42-console-serving-cert\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:59:46.950653 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:46.950525 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36d6a53c-29fb-419a-b3d8-25015e067b42-service-ca\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 24 23:59:47.679842 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:47.679764 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59cd4bcc77-5vstg_36d6a53c-29fb-419a-b3d8-25015e067b42/console/0.log"
Apr 24 23:59:47.680177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:47.679850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59cd4bcc77-5vstg" event={"ID":"36d6a53c-29fb-419a-b3d8-25015e067b42","Type":"ContainerDied","Data":"4dc9f15b320ec13bee7fd1ff7ea4492eb223870e5a289a081b4d0551c12e9ff5"}
Apr 24 23:59:47.680177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:47.679886 2576 scope.go:117] "RemoveContainer" containerID="4c475422bc5122a2dd683b0d647ac60ef4676be9966dca3232db13ed2dbfbc71"
Apr 24 23:59:47.680177 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:47.679910 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59cd4bcc77-5vstg"
Apr 24 23:59:47.697204 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:47.697181 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59cd4bcc77-5vstg"]
Apr 24 23:59:47.700821 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:47.700794 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59cd4bcc77-5vstg"]
Apr 24 23:59:49.399613 ip-10-0-132-64 kubenswrapper[2576]: I0424 23:59:49.399582 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d6a53c-29fb-419a-b3d8-25015e067b42" path="/var/lib/kubelet/pods/36d6a53c-29fb-419a-b3d8-25015e067b42/volumes"
Apr 25 00:00:00.950333 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:00.950294 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-pxtxr"]
Apr 25 00:00:00.950747 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:00.950664 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12c5df2c-e0bf-49b6-8272-0817e3902d6d" containerName="console"
Apr 25 00:00:00.950747 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:00.950676 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c5df2c-e0bf-49b6-8272-0817e3902d6d" containerName="console"
Apr 25 00:00:00.950747 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:00.950687 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36d6a53c-29fb-419a-b3d8-25015e067b42" containerName="console"
Apr 25 00:00:00.950747 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:00.950715 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d6a53c-29fb-419a-b3d8-25015e067b42" containerName="console"
Apr 25 00:00:00.950875 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:00.950766 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="12c5df2c-e0bf-49b6-8272-0817e3902d6d" containerName="console"
Apr 25 00:00:00.950875 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:00.950778 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="36d6a53c-29fb-419a-b3d8-25015e067b42" containerName="console"
Apr 25 00:00:00.955358 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:00.955340 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pxtxr"
Apr 25 00:00:00.959135 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:00.959114 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 25 00:00:00.961934 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:00.961912 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pxtxr"]
Apr 25 00:00:01.071959 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:01.071925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e751c0d7-8e62-4d09-bbcb-7987a6bb0be2-dbus\") pod \"global-pull-secret-syncer-pxtxr\" (UID: \"e751c0d7-8e62-4d09-bbcb-7987a6bb0be2\") " pod="kube-system/global-pull-secret-syncer-pxtxr"
Apr 25 00:00:01.072106 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:01.071971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e751c0d7-8e62-4d09-bbcb-7987a6bb0be2-kubelet-config\") pod \"global-pull-secret-syncer-pxtxr\" (UID: \"e751c0d7-8e62-4d09-bbcb-7987a6bb0be2\") " pod="kube-system/global-pull-secret-syncer-pxtxr"
Apr 25 00:00:01.072106 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:01.072009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e751c0d7-8e62-4d09-bbcb-7987a6bb0be2-original-pull-secret\") pod \"global-pull-secret-syncer-pxtxr\" (UID: \"e751c0d7-8e62-4d09-bbcb-7987a6bb0be2\") " pod="kube-system/global-pull-secret-syncer-pxtxr"
Apr 25 00:00:01.173036 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:01.172993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e751c0d7-8e62-4d09-bbcb-7987a6bb0be2-dbus\") pod \"global-pull-secret-syncer-pxtxr\" (UID: \"e751c0d7-8e62-4d09-bbcb-7987a6bb0be2\") " pod="kube-system/global-pull-secret-syncer-pxtxr"
Apr 25 00:00:01.173221 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:01.173046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e751c0d7-8e62-4d09-bbcb-7987a6bb0be2-kubelet-config\") pod \"global-pull-secret-syncer-pxtxr\" (UID: \"e751c0d7-8e62-4d09-bbcb-7987a6bb0be2\") " pod="kube-system/global-pull-secret-syncer-pxtxr"
Apr 25 00:00:01.173221 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:01.173090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e751c0d7-8e62-4d09-bbcb-7987a6bb0be2-original-pull-secret\") pod \"global-pull-secret-syncer-pxtxr\" (UID: \"e751c0d7-8e62-4d09-bbcb-7987a6bb0be2\") " pod="kube-system/global-pull-secret-syncer-pxtxr"
Apr 25 00:00:01.173221 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:01.173186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e751c0d7-8e62-4d09-bbcb-7987a6bb0be2-dbus\") pod \"global-pull-secret-syncer-pxtxr\" (UID: \"e751c0d7-8e62-4d09-bbcb-7987a6bb0be2\") " pod="kube-system/global-pull-secret-syncer-pxtxr"
Apr 25 00:00:01.173381 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:01.173213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e751c0d7-8e62-4d09-bbcb-7987a6bb0be2-kubelet-config\") pod \"global-pull-secret-syncer-pxtxr\" (UID: \"e751c0d7-8e62-4d09-bbcb-7987a6bb0be2\") " pod="kube-system/global-pull-secret-syncer-pxtxr"
Apr 25 00:00:01.175575 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:01.175553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e751c0d7-8e62-4d09-bbcb-7987a6bb0be2-original-pull-secret\") pod \"global-pull-secret-syncer-pxtxr\" (UID: \"e751c0d7-8e62-4d09-bbcb-7987a6bb0be2\") " pod="kube-system/global-pull-secret-syncer-pxtxr"
Apr 25 00:00:01.285451 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:01.285410 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pxtxr"
Apr 25 00:00:01.410797 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:01.410769 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pxtxr"]
Apr 25 00:00:01.413267 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:00:01.413241 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode751c0d7_8e62_4d09_bbcb_7987a6bb0be2.slice/crio-1f51554b69da92b853b248b30dfed8bb9832b6245566abf7652c2653e03a4b91 WatchSource:0}: Error finding container 1f51554b69da92b853b248b30dfed8bb9832b6245566abf7652c2653e03a4b91: Status 404 returned error can't find the container with id 1f51554b69da92b853b248b30dfed8bb9832b6245566abf7652c2653e03a4b91
Apr 25 00:00:01.414904 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:01.414886 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 25 00:00:01.723819 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:01.723730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pxtxr" event={"ID":"e751c0d7-8e62-4d09-bbcb-7987a6bb0be2","Type":"ContainerStarted","Data":"1f51554b69da92b853b248b30dfed8bb9832b6245566abf7652c2653e03a4b91"}
Apr 25 00:00:06.739831 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:06.739793 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pxtxr" event={"ID":"e751c0d7-8e62-4d09-bbcb-7987a6bb0be2","Type":"ContainerStarted","Data":"798c194d3782b9ab4a35edfd8a2b8a20e1c8d267356fc3bb890a19c9275e5b1a"}
Apr 25 00:00:06.755351 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:06.755292 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-pxtxr" podStartSLOduration=2.421506311 podStartE2EDuration="6.755277558s" podCreationTimestamp="2026-04-25 00:00:00 +0000 UTC" firstStartedPulling="2026-04-25 00:00:01.415009965 +0000 UTC m=+362.575959584" lastFinishedPulling="2026-04-25 00:00:05.74878121 +0000 UTC m=+366.909730831" observedRunningTime="2026-04-25 00:00:06.754707951 +0000 UTC m=+367.915657583" watchObservedRunningTime="2026-04-25 00:00:06.755277558 +0000 UTC m=+367.916227199"
Apr 25 00:00:54.742197 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:54.742159 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx"]
Apr 25 00:00:54.745447 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:54.745419 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx"
Apr 25 00:00:54.747798 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:54.747778 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 25 00:00:54.748393 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:54.748365 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dvpxz\""
Apr 25 00:00:54.748393 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:54.748387 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 25 00:00:54.753504 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:54.753483 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx"]
Apr 25 00:00:54.905044 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:54.904999 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1852a20-f4f1-4137-8074-e60e8a069758-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx\" (UID: \"d1852a20-f4f1-4137-8074-e60e8a069758\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx"
Apr 25 00:00:54.905044 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:54.905043 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nlf2\" (UniqueName: \"kubernetes.io/projected/d1852a20-f4f1-4137-8074-e60e8a069758-kube-api-access-5nlf2\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx\" (UID: \"d1852a20-f4f1-4137-8074-e60e8a069758\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx"
Apr 25 00:00:54.905313 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:54.905149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1852a20-f4f1-4137-8074-e60e8a069758-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx\" (UID: \"d1852a20-f4f1-4137-8074-e60e8a069758\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx"
Apr 25 00:00:55.006010 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:55.005975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1852a20-f4f1-4137-8074-e60e8a069758-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx\" (UID: \"d1852a20-f4f1-4137-8074-e60e8a069758\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx"
Apr 25 00:00:55.006173 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:55.006012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nlf2\" (UniqueName: \"kubernetes.io/projected/d1852a20-f4f1-4137-8074-e60e8a069758-kube-api-access-5nlf2\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx\" (UID: \"d1852a20-f4f1-4137-8074-e60e8a069758\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx"
Apr 25 00:00:55.006173 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:55.006056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1852a20-f4f1-4137-8074-e60e8a069758-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx\" (UID: \"d1852a20-f4f1-4137-8074-e60e8a069758\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx"
Apr 25 00:00:55.006373 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:55.006351 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1852a20-f4f1-4137-8074-e60e8a069758-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx\" (UID: \"d1852a20-f4f1-4137-8074-e60e8a069758\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx"
Apr 25 00:00:55.006428 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:55.006406 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1852a20-f4f1-4137-8074-e60e8a069758-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx\" (UID: \"d1852a20-f4f1-4137-8074-e60e8a069758\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx"
Apr 25 00:00:55.014683 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:55.014657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nlf2\" (UniqueName: \"kubernetes.io/projected/d1852a20-f4f1-4137-8074-e60e8a069758-kube-api-access-5nlf2\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx\" (UID: \"d1852a20-f4f1-4137-8074-e60e8a069758\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx"
Apr 25 00:00:55.055646 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:55.055621 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx"
Apr 25 00:00:55.177921 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:55.177900 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx"]
Apr 25 00:00:55.180161 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:00:55.180132 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1852a20_f4f1_4137_8074_e60e8a069758.slice/crio-30f55c1198fb082f0513bc7f4d82db49ade9f9ec7ae2f0272554b56e31866c6c WatchSource:0}: Error finding container 30f55c1198fb082f0513bc7f4d82db49ade9f9ec7ae2f0272554b56e31866c6c: Status 404 returned error can't find the container with id 30f55c1198fb082f0513bc7f4d82db49ade9f9ec7ae2f0272554b56e31866c6c
Apr 25 00:00:55.886143 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:00:55.886102 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx" event={"ID":"d1852a20-f4f1-4137-8074-e60e8a069758","Type":"ContainerStarted","Data":"30f55c1198fb082f0513bc7f4d82db49ade9f9ec7ae2f0272554b56e31866c6c"}
Apr 25 00:01:00.905066 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:00.905023 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx" event={"ID":"d1852a20-f4f1-4137-8074-e60e8a069758","Type":"ContainerStarted","Data":"6d62d471cce213bf649e835beec35ee91c39448aa6651dc7b3cd13046cf60d6c"}
Apr 25 00:01:01.909072 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:01.909040 2576 generic.go:358] "Generic (PLEG): container finished" podID="d1852a20-f4f1-4137-8074-e60e8a069758" containerID="6d62d471cce213bf649e835beec35ee91c39448aa6651dc7b3cd13046cf60d6c" exitCode=0
Apr 25 00:01:01.909522 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:01.909100 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx" event={"ID":"d1852a20-f4f1-4137-8074-e60e8a069758","Type":"ContainerDied","Data":"6d62d471cce213bf649e835beec35ee91c39448aa6651dc7b3cd13046cf60d6c"}
Apr 25 00:01:04.919431 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:04.919341 2576 generic.go:358] "Generic (PLEG): container finished" podID="d1852a20-f4f1-4137-8074-e60e8a069758" containerID="43617dde8b832c34f0ae98eef29afb6f0b490b0dfa5fa752a0088e55fe607719" exitCode=0
Apr 25 00:01:04.919798 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:04.919429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx" event={"ID":"d1852a20-f4f1-4137-8074-e60e8a069758","Type":"ContainerDied","Data":"43617dde8b832c34f0ae98eef29afb6f0b490b0dfa5fa752a0088e55fe607719"}
Apr 25 00:01:11.943162 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:11.943125 2576 generic.go:358] "Generic (PLEG): container finished" podID="d1852a20-f4f1-4137-8074-e60e8a069758" containerID="c9ac91fe1e0d8167a4d90a55039562e2b4c08b4902dcca607f98eed74ba52ca8" exitCode=0
Apr 25 00:01:11.943535 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:11.943231 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx" event={"ID":"d1852a20-f4f1-4137-8074-e60e8a069758","Type":"ContainerDied","Data":"c9ac91fe1e0d8167a4d90a55039562e2b4c08b4902dcca607f98eed74ba52ca8"}
Apr 25 00:01:13.072415 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:13.072393 2576 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx" Apr 25 00:01:13.155615 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:13.155585 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1852a20-f4f1-4137-8074-e60e8a069758-bundle\") pod \"d1852a20-f4f1-4137-8074-e60e8a069758\" (UID: \"d1852a20-f4f1-4137-8074-e60e8a069758\") " Apr 25 00:01:13.155802 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:13.155622 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nlf2\" (UniqueName: \"kubernetes.io/projected/d1852a20-f4f1-4137-8074-e60e8a069758-kube-api-access-5nlf2\") pod \"d1852a20-f4f1-4137-8074-e60e8a069758\" (UID: \"d1852a20-f4f1-4137-8074-e60e8a069758\") " Apr 25 00:01:13.155802 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:13.155649 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1852a20-f4f1-4137-8074-e60e8a069758-util\") pod \"d1852a20-f4f1-4137-8074-e60e8a069758\" (UID: \"d1852a20-f4f1-4137-8074-e60e8a069758\") " Apr 25 00:01:13.156242 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:13.156212 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1852a20-f4f1-4137-8074-e60e8a069758-bundle" (OuterVolumeSpecName: "bundle") pod "d1852a20-f4f1-4137-8074-e60e8a069758" (UID: "d1852a20-f4f1-4137-8074-e60e8a069758"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:01:13.157978 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:13.157959 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1852a20-f4f1-4137-8074-e60e8a069758-kube-api-access-5nlf2" (OuterVolumeSpecName: "kube-api-access-5nlf2") pod "d1852a20-f4f1-4137-8074-e60e8a069758" (UID: "d1852a20-f4f1-4137-8074-e60e8a069758"). InnerVolumeSpecName "kube-api-access-5nlf2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:01:13.162036 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:13.162009 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1852a20-f4f1-4137-8074-e60e8a069758-util" (OuterVolumeSpecName: "util") pod "d1852a20-f4f1-4137-8074-e60e8a069758" (UID: "d1852a20-f4f1-4137-8074-e60e8a069758"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:01:13.256729 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:13.256684 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1852a20-f4f1-4137-8074-e60e8a069758-bundle\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:01:13.256729 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:13.256732 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5nlf2\" (UniqueName: \"kubernetes.io/projected/d1852a20-f4f1-4137-8074-e60e8a069758-kube-api-access-5nlf2\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:01:13.256894 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:13.256745 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1852a20-f4f1-4137-8074-e60e8a069758-util\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:01:13.951354 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:13.951325 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx" event={"ID":"d1852a20-f4f1-4137-8074-e60e8a069758","Type":"ContainerDied","Data":"30f55c1198fb082f0513bc7f4d82db49ade9f9ec7ae2f0272554b56e31866c6c"} Apr 25 00:01:13.951354 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:13.951360 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30f55c1198fb082f0513bc7f4d82db49ade9f9ec7ae2f0272554b56e31866c6c" Apr 25 00:01:13.951556 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:13.951330 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqw2cx" Apr 25 00:01:21.051294 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.051255 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-b4jdt"] Apr 25 00:01:21.051804 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.051622 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1852a20-f4f1-4137-8074-e60e8a069758" containerName="util" Apr 25 00:01:21.051804 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.051633 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1852a20-f4f1-4137-8074-e60e8a069758" containerName="util" Apr 25 00:01:21.051804 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.051640 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1852a20-f4f1-4137-8074-e60e8a069758" containerName="pull" Apr 25 00:01:21.051804 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.051647 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1852a20-f4f1-4137-8074-e60e8a069758" containerName="pull" Apr 25 00:01:21.051804 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.051665 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1852a20-f4f1-4137-8074-e60e8a069758" 
containerName="extract" Apr 25 00:01:21.051804 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.051671 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1852a20-f4f1-4137-8074-e60e8a069758" containerName="extract" Apr 25 00:01:21.051804 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.051751 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1852a20-f4f1-4137-8074-e60e8a069758" containerName="extract" Apr 25 00:01:21.099820 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.099784 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-b4jdt"] Apr 25 00:01:21.099987 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.099956 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:21.102892 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.102863 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 25 00:01:21.102892 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.102882 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 25 00:01:21.102892 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.102890 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 25 00:01:21.103128 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.102869 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 25 00:01:21.103128 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.102903 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-p5vmd\"" Apr 25 00:01:21.103128 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.102886 2576 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 25 00:01:21.123849 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.123826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates\") pod \"keda-operator-ffbb595cb-b4jdt\" (UID: \"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368\") " pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:21.123963 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.123856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d58vc\" (UniqueName: \"kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-kube-api-access-d58vc\") pod \"keda-operator-ffbb595cb-b4jdt\" (UID: \"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368\") " pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:21.123963 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.123900 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-cabundle0\") pod \"keda-operator-ffbb595cb-b4jdt\" (UID: \"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368\") " pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:21.224540 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.224500 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates\") pod \"keda-operator-ffbb595cb-b4jdt\" (UID: \"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368\") " pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:21.224540 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.224539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d58vc\" (UniqueName: 
\"kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-kube-api-access-d58vc\") pod \"keda-operator-ffbb595cb-b4jdt\" (UID: \"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368\") " pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:21.224791 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.224583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-cabundle0\") pod \"keda-operator-ffbb595cb-b4jdt\" (UID: \"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368\") " pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:21.224791 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:01:21.224632 2576 secret.go:281] references non-existent secret key: ca.crt Apr 25 00:01:21.224791 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:01:21.224654 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 25 00:01:21.224791 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:01:21.224663 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b4jdt: references non-existent secret key: ca.crt Apr 25 00:01:21.224791 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:01:21.224748 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates podName:c1c7ad31-ae8d-4873-9dfc-8f330f5ed368 nodeName:}" failed. No retries permitted until 2026-04-25 00:01:21.724728955 +0000 UTC m=+442.885678580 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates") pod "keda-operator-ffbb595cb-b4jdt" (UID: "c1c7ad31-ae8d-4873-9dfc-8f330f5ed368") : references non-existent secret key: ca.crt Apr 25 00:01:21.225165 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.225148 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-cabundle0\") pod \"keda-operator-ffbb595cb-b4jdt\" (UID: \"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368\") " pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:21.235181 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.235163 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d58vc\" (UniqueName: \"kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-kube-api-access-d58vc\") pod \"keda-operator-ffbb595cb-b4jdt\" (UID: \"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368\") " pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:21.684941 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.684904 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-ns6rm"] Apr 25 00:01:21.711512 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.711470 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-ns6rm"] Apr 25 00:01:21.711651 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.711600 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-ns6rm" Apr 25 00:01:21.713897 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.713873 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 25 00:01:21.728914 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.728891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrn79\" (UniqueName: \"kubernetes.io/projected/69368b95-9107-4bca-a737-842fb41e3e34-kube-api-access-qrn79\") pod \"keda-admission-cf49989db-ns6rm\" (UID: \"69368b95-9107-4bca-a737-842fb41e3e34\") " pod="openshift-keda/keda-admission-cf49989db-ns6rm" Apr 25 00:01:21.729020 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.728935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates\") pod \"keda-operator-ffbb595cb-b4jdt\" (UID: \"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368\") " pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:21.729067 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.729002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/69368b95-9107-4bca-a737-842fb41e3e34-certificates\") pod \"keda-admission-cf49989db-ns6rm\" (UID: \"69368b95-9107-4bca-a737-842fb41e3e34\") " pod="openshift-keda/keda-admission-cf49989db-ns6rm" Apr 25 00:01:21.729067 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:01:21.729034 2576 secret.go:281] references non-existent secret key: ca.crt Apr 25 00:01:21.729067 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:01:21.729046 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 25 00:01:21.729067 ip-10-0-132-64 kubenswrapper[2576]: E0425 
00:01:21.729054 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b4jdt: references non-existent secret key: ca.crt Apr 25 00:01:21.729187 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:01:21.729092 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates podName:c1c7ad31-ae8d-4873-9dfc-8f330f5ed368 nodeName:}" failed. No retries permitted until 2026-04-25 00:01:22.729079495 +0000 UTC m=+443.890029114 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates") pod "keda-operator-ffbb595cb-b4jdt" (UID: "c1c7ad31-ae8d-4873-9dfc-8f330f5ed368") : references non-existent secret key: ca.crt Apr 25 00:01:21.829735 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.829688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/69368b95-9107-4bca-a737-842fb41e3e34-certificates\") pod \"keda-admission-cf49989db-ns6rm\" (UID: \"69368b95-9107-4bca-a737-842fb41e3e34\") " pod="openshift-keda/keda-admission-cf49989db-ns6rm" Apr 25 00:01:21.829897 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.829773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrn79\" (UniqueName: \"kubernetes.io/projected/69368b95-9107-4bca-a737-842fb41e3e34-kube-api-access-qrn79\") pod \"keda-admission-cf49989db-ns6rm\" (UID: \"69368b95-9107-4bca-a737-842fb41e3e34\") " pod="openshift-keda/keda-admission-cf49989db-ns6rm" Apr 25 00:01:21.832641 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.832612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/69368b95-9107-4bca-a737-842fb41e3e34-certificates\") pod 
\"keda-admission-cf49989db-ns6rm\" (UID: \"69368b95-9107-4bca-a737-842fb41e3e34\") " pod="openshift-keda/keda-admission-cf49989db-ns6rm" Apr 25 00:01:21.838276 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:21.838249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrn79\" (UniqueName: \"kubernetes.io/projected/69368b95-9107-4bca-a737-842fb41e3e34-kube-api-access-qrn79\") pod \"keda-admission-cf49989db-ns6rm\" (UID: \"69368b95-9107-4bca-a737-842fb41e3e34\") " pod="openshift-keda/keda-admission-cf49989db-ns6rm" Apr 25 00:01:22.022615 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:22.022565 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-ns6rm" Apr 25 00:01:22.154116 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:22.154091 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-ns6rm"] Apr 25 00:01:22.156298 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:01:22.156272 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69368b95_9107_4bca_a737_842fb41e3e34.slice/crio-4bc05371b2448d670e317168eefb000d2cc83b6ea5847a90ee5ca15e3881407d WatchSource:0}: Error finding container 4bc05371b2448d670e317168eefb000d2cc83b6ea5847a90ee5ca15e3881407d: Status 404 returned error can't find the container with id 4bc05371b2448d670e317168eefb000d2cc83b6ea5847a90ee5ca15e3881407d Apr 25 00:01:22.737092 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:22.737057 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates\") pod \"keda-operator-ffbb595cb-b4jdt\" (UID: \"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368\") " pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:22.737247 ip-10-0-132-64 kubenswrapper[2576]: E0425 
00:01:22.737220 2576 secret.go:281] references non-existent secret key: ca.crt Apr 25 00:01:22.737247 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:01:22.737242 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 25 00:01:22.737319 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:01:22.737255 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b4jdt: references non-existent secret key: ca.crt Apr 25 00:01:22.737319 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:01:22.737312 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates podName:c1c7ad31-ae8d-4873-9dfc-8f330f5ed368 nodeName:}" failed. No retries permitted until 2026-04-25 00:01:24.737295136 +0000 UTC m=+445.898244755 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates") pod "keda-operator-ffbb595cb-b4jdt" (UID: "c1c7ad31-ae8d-4873-9dfc-8f330f5ed368") : references non-existent secret key: ca.crt Apr 25 00:01:22.981069 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:22.981020 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-ns6rm" event={"ID":"69368b95-9107-4bca-a737-842fb41e3e34","Type":"ContainerStarted","Data":"4bc05371b2448d670e317168eefb000d2cc83b6ea5847a90ee5ca15e3881407d"} Apr 25 00:01:23.985206 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:23.985171 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-ns6rm" event={"ID":"69368b95-9107-4bca-a737-842fb41e3e34","Type":"ContainerStarted","Data":"e1abccec798e24322886fbdad1778c4aafffffa8a0f03d266633db76019cf5d0"} Apr 25 00:01:23.985554 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:23.985284 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-ns6rm" Apr 25 00:01:24.002767 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:24.002724 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-ns6rm" podStartSLOduration=1.29182378 podStartE2EDuration="3.002711684s" podCreationTimestamp="2026-04-25 00:01:21 +0000 UTC" firstStartedPulling="2026-04-25 00:01:22.157653314 +0000 UTC m=+443.318602933" lastFinishedPulling="2026-04-25 00:01:23.868541204 +0000 UTC m=+445.029490837" observedRunningTime="2026-04-25 00:01:24.000445733 +0000 UTC m=+445.161395405" watchObservedRunningTime="2026-04-25 00:01:24.002711684 +0000 UTC m=+445.163661316" Apr 25 00:01:24.755377 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:24.755339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates\") pod \"keda-operator-ffbb595cb-b4jdt\" (UID: \"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368\") " pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:24.755561 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:01:24.755452 2576 secret.go:281] references non-existent secret key: ca.crt Apr 25 00:01:24.755561 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:01:24.755464 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 25 00:01:24.755561 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:01:24.755473 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b4jdt: references non-existent secret key: ca.crt Apr 25 00:01:24.755561 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:01:24.755522 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates 
podName:c1c7ad31-ae8d-4873-9dfc-8f330f5ed368 nodeName:}" failed. No retries permitted until 2026-04-25 00:01:28.755508255 +0000 UTC m=+449.916457878 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates") pod "keda-operator-ffbb595cb-b4jdt" (UID: "c1c7ad31-ae8d-4873-9dfc-8f330f5ed368") : references non-existent secret key: ca.crt Apr 25 00:01:28.792872 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:28.792836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates\") pod \"keda-operator-ffbb595cb-b4jdt\" (UID: \"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368\") " pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:28.795351 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:28.795330 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c1c7ad31-ae8d-4873-9dfc-8f330f5ed368-certificates\") pod \"keda-operator-ffbb595cb-b4jdt\" (UID: \"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368\") " pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:28.911937 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:28.911883 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:29.036059 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:29.035921 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-b4jdt"] Apr 25 00:01:29.038533 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:01:29.038502 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1c7ad31_ae8d_4873_9dfc_8f330f5ed368.slice/crio-8a1b6abe4c6fbd0f330f2abcb090744be901b47a0729b55743cd73caf4f2442c WatchSource:0}: Error finding container 8a1b6abe4c6fbd0f330f2abcb090744be901b47a0729b55743cd73caf4f2442c: Status 404 returned error can't find the container with id 8a1b6abe4c6fbd0f330f2abcb090744be901b47a0729b55743cd73caf4f2442c Apr 25 00:01:30.005097 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:30.005055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" event={"ID":"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368","Type":"ContainerStarted","Data":"8a1b6abe4c6fbd0f330f2abcb090744be901b47a0729b55743cd73caf4f2442c"} Apr 25 00:01:33.020027 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:33.019984 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" event={"ID":"c1c7ad31-ae8d-4873-9dfc-8f330f5ed368","Type":"ContainerStarted","Data":"2555e9039bff1b8993c0f6b58ad193cdd3505878d3b94e43d7b5f64190a08808"} Apr 25 00:01:33.020440 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:33.020096 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:01:33.038776 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:33.038727 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" podStartSLOduration=8.876027828 podStartE2EDuration="12.038680767s" 
podCreationTimestamp="2026-04-25 00:01:21 +0000 UTC" firstStartedPulling="2026-04-25 00:01:29.03991861 +0000 UTC m=+450.200868230" lastFinishedPulling="2026-04-25 00:01:32.202571547 +0000 UTC m=+453.363521169" observedRunningTime="2026-04-25 00:01:33.037941546 +0000 UTC m=+454.198891188" watchObservedRunningTime="2026-04-25 00:01:33.038680767 +0000 UTC m=+454.199630411" Apr 25 00:01:44.990981 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:44.990948 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-ns6rm" Apr 25 00:01:54.026147 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:01:54.026109 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-b4jdt" Apr 25 00:02:27.568032 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:27.567995 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-st7fs"] Apr 25 00:02:27.570670 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:27.570652 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-st7fs" Apr 25 00:02:27.572937 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:27.572912 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 25 00:02:27.573059 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:27.572917 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 25 00:02:27.573059 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:27.572920 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-mgtpf\"" Apr 25 00:02:27.573616 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:27.573598 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 25 00:02:27.579304 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:27.579280 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-st7fs"] Apr 25 00:02:27.661948 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:27.661923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1c270bee-5bf3-45f2-9b85-e043a21fcca6-data\") pod \"seaweedfs-86cc847c5c-st7fs\" (UID: \"1c270bee-5bf3-45f2-9b85-e043a21fcca6\") " pod="kserve/seaweedfs-86cc847c5c-st7fs" Apr 25 00:02:27.662082 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:27.661967 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rh2k\" (UniqueName: \"kubernetes.io/projected/1c270bee-5bf3-45f2-9b85-e043a21fcca6-kube-api-access-8rh2k\") pod \"seaweedfs-86cc847c5c-st7fs\" (UID: \"1c270bee-5bf3-45f2-9b85-e043a21fcca6\") " pod="kserve/seaweedfs-86cc847c5c-st7fs" Apr 25 00:02:27.762626 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:27.762579 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8rh2k\" (UniqueName: \"kubernetes.io/projected/1c270bee-5bf3-45f2-9b85-e043a21fcca6-kube-api-access-8rh2k\") pod \"seaweedfs-86cc847c5c-st7fs\" (UID: \"1c270bee-5bf3-45f2-9b85-e043a21fcca6\") " pod="kserve/seaweedfs-86cc847c5c-st7fs" Apr 25 00:02:27.762807 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:27.762672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1c270bee-5bf3-45f2-9b85-e043a21fcca6-data\") pod \"seaweedfs-86cc847c5c-st7fs\" (UID: \"1c270bee-5bf3-45f2-9b85-e043a21fcca6\") " pod="kserve/seaweedfs-86cc847c5c-st7fs" Apr 25 00:02:27.763013 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:27.762992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1c270bee-5bf3-45f2-9b85-e043a21fcca6-data\") pod \"seaweedfs-86cc847c5c-st7fs\" (UID: \"1c270bee-5bf3-45f2-9b85-e043a21fcca6\") " pod="kserve/seaweedfs-86cc847c5c-st7fs" Apr 25 00:02:27.772099 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:27.772072 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rh2k\" (UniqueName: \"kubernetes.io/projected/1c270bee-5bf3-45f2-9b85-e043a21fcca6-kube-api-access-8rh2k\") pod \"seaweedfs-86cc847c5c-st7fs\" (UID: \"1c270bee-5bf3-45f2-9b85-e043a21fcca6\") " pod="kserve/seaweedfs-86cc847c5c-st7fs" Apr 25 00:02:27.880931 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:27.880842 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-st7fs" Apr 25 00:02:28.010105 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:02:28.010074 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c270bee_5bf3_45f2_9b85_e043a21fcca6.slice/crio-41bf096a55b6532502679026eb4b82a72e5e7421a2f229e3cb94d135519f5287 WatchSource:0}: Error finding container 41bf096a55b6532502679026eb4b82a72e5e7421a2f229e3cb94d135519f5287: Status 404 returned error can't find the container with id 41bf096a55b6532502679026eb4b82a72e5e7421a2f229e3cb94d135519f5287 Apr 25 00:02:28.013297 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:28.013270 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-st7fs"] Apr 25 00:02:28.206240 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:28.206148 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-st7fs" event={"ID":"1c270bee-5bf3-45f2-9b85-e043a21fcca6","Type":"ContainerStarted","Data":"41bf096a55b6532502679026eb4b82a72e5e7421a2f229e3cb94d135519f5287"} Apr 25 00:02:31.219262 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:31.219224 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-st7fs" event={"ID":"1c270bee-5bf3-45f2-9b85-e043a21fcca6","Type":"ContainerStarted","Data":"1eb0e7435e1c786f3116eb9d8e6d2eac28d4a35f1c4e5bca5d1698450415b285"} Apr 25 00:02:31.219629 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:31.219304 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-st7fs" Apr 25 00:02:31.236798 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:31.236755 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-st7fs" podStartSLOduration=1.855581481 podStartE2EDuration="4.236744362s" podCreationTimestamp="2026-04-25 00:02:27 +0000 UTC" firstStartedPulling="2026-04-25 
00:02:28.011283225 +0000 UTC m=+509.172232854" lastFinishedPulling="2026-04-25 00:02:30.392446116 +0000 UTC m=+511.553395735" observedRunningTime="2026-04-25 00:02:31.234810243 +0000 UTC m=+512.395759887" watchObservedRunningTime="2026-04-25 00:02:31.236744362 +0000 UTC m=+512.397694002" Apr 25 00:02:37.225456 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:02:37.225364 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-st7fs" Apr 25 00:03:04.529371 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.529334 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-cb59d7578-sk9dg"] Apr 25 00:03:04.533305 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.533271 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.537354 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.537323 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 25 00:03:04.538166 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.538105 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 25 00:03:04.538166 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.538150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 25 00:03:04.538382 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.538184 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-2sr25\"" Apr 25 00:03:04.538382 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.538251 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 25 00:03:04.538382 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.538289 2576 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 25 00:03:04.538514 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.538460 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 25 00:03:04.538992 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.538962 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 25 00:03:04.544541 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.544518 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cb59d7578-sk9dg"] Apr 25 00:03:04.544743 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.544720 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 25 00:03:04.673306 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.673277 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec5ce6b6-b2f8-4931-972c-963fb1274140-console-serving-cert\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.673499 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.673341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sdbp\" (UniqueName: \"kubernetes.io/projected/ec5ce6b6-b2f8-4931-972c-963fb1274140-kube-api-access-2sdbp\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.673499 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.673381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec5ce6b6-b2f8-4931-972c-963fb1274140-console-oauth-config\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.673499 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.673398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec5ce6b6-b2f8-4931-972c-963fb1274140-console-config\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.673499 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.673447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec5ce6b6-b2f8-4931-972c-963fb1274140-oauth-serving-cert\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.673499 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.673472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec5ce6b6-b2f8-4931-972c-963fb1274140-service-ca\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.673499 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.673485 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec5ce6b6-b2f8-4931-972c-963fb1274140-trusted-ca-bundle\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.773913 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:03:04.773865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sdbp\" (UniqueName: \"kubernetes.io/projected/ec5ce6b6-b2f8-4931-972c-963fb1274140-kube-api-access-2sdbp\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.774097 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.773940 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec5ce6b6-b2f8-4931-972c-963fb1274140-console-oauth-config\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.774097 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.773972 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec5ce6b6-b2f8-4931-972c-963fb1274140-console-config\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.774097 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.773994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec5ce6b6-b2f8-4931-972c-963fb1274140-oauth-serving-cert\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.774097 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.774030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec5ce6b6-b2f8-4931-972c-963fb1274140-service-ca\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " 
pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.774097 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.774052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec5ce6b6-b2f8-4931-972c-963fb1274140-trusted-ca-bundle\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.774317 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.774107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec5ce6b6-b2f8-4931-972c-963fb1274140-console-serving-cert\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.774830 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.774806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec5ce6b6-b2f8-4931-972c-963fb1274140-service-ca\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.774941 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.774834 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec5ce6b6-b2f8-4931-972c-963fb1274140-console-config\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.774941 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.774834 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec5ce6b6-b2f8-4931-972c-963fb1274140-oauth-serving-cert\") pod \"console-cb59d7578-sk9dg\" (UID: 
\"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.775011 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.774942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec5ce6b6-b2f8-4931-972c-963fb1274140-trusted-ca-bundle\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.777102 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.777084 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec5ce6b6-b2f8-4931-972c-963fb1274140-console-serving-cert\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.777163 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.777110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec5ce6b6-b2f8-4931-972c-963fb1274140-console-oauth-config\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.790192 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.790138 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sdbp\" (UniqueName: \"kubernetes.io/projected/ec5ce6b6-b2f8-4931-972c-963fb1274140-kube-api-access-2sdbp\") pod \"console-cb59d7578-sk9dg\" (UID: \"ec5ce6b6-b2f8-4931-972c-963fb1274140\") " pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.845869 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.845831 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:04.979827 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:04.979773 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cb59d7578-sk9dg"] Apr 25 00:03:04.981560 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:03:04.981533 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec5ce6b6_b2f8_4931_972c_963fb1274140.slice/crio-ec181292f53d6cc34ec02b6118f8dcf3b546b4e438eb6fbe140b03b87402666f WatchSource:0}: Error finding container ec181292f53d6cc34ec02b6118f8dcf3b546b4e438eb6fbe140b03b87402666f: Status 404 returned error can't find the container with id ec181292f53d6cc34ec02b6118f8dcf3b546b4e438eb6fbe140b03b87402666f Apr 25 00:03:05.336820 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:05.336781 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cb59d7578-sk9dg" event={"ID":"ec5ce6b6-b2f8-4931-972c-963fb1274140","Type":"ContainerStarted","Data":"f1103cf97eb9d18df0da9bee4ce98ae2e116b6ba90f0134b0f207e63c94bea91"} Apr 25 00:03:05.336820 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:05.336820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cb59d7578-sk9dg" event={"ID":"ec5ce6b6-b2f8-4931-972c-963fb1274140","Type":"ContainerStarted","Data":"ec181292f53d6cc34ec02b6118f8dcf3b546b4e438eb6fbe140b03b87402666f"} Apr 25 00:03:05.355625 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:05.355550 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cb59d7578-sk9dg" podStartSLOduration=1.355523096 podStartE2EDuration="1.355523096s" podCreationTimestamp="2026-04-25 00:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:03:05.35306058 +0000 UTC m=+546.514010215" 
watchObservedRunningTime="2026-04-25 00:03:05.355523096 +0000 UTC m=+546.516472741" Apr 25 00:03:14.846898 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:14.846853 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:14.847292 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:14.846909 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:14.851805 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:14.851778 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:15.374982 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:15.374955 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cb59d7578-sk9dg" Apr 25 00:03:37.871453 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:37.871417 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-55x25"] Apr 25 00:03:37.874994 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:37.874976 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-55x25" Apr 25 00:03:37.877370 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:37.877346 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 25 00:03:37.877530 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:37.877509 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-kf7tf\"" Apr 25 00:03:37.889156 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:37.889136 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-55x25"] Apr 25 00:03:37.909674 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:37.909651 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-5rhzw"] Apr 25 00:03:37.912991 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:37.912971 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-5rhzw" Apr 25 00:03:37.915970 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:37.915948 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-jmpt5\"" Apr 25 00:03:37.916067 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:37.916022 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 25 00:03:37.925138 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:37.925118 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-5rhzw"] Apr 25 00:03:37.950023 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:37.949998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4b5h\" (UniqueName: \"kubernetes.io/projected/815ed346-ab88-4b9d-9f6b-a56172cc8a9d-kube-api-access-v4b5h\") pod 
\"odh-model-controller-696fc77849-5rhzw\" (UID: \"815ed346-ab88-4b9d-9f6b-a56172cc8a9d\") " pod="kserve/odh-model-controller-696fc77849-5rhzw" Apr 25 00:03:37.950127 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:37.950034 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9026da9a-5b30-4323-9828-0c7956b74eff-tls-certs\") pod \"model-serving-api-86f7b4b499-55x25\" (UID: \"9026da9a-5b30-4323-9828-0c7956b74eff\") " pod="kserve/model-serving-api-86f7b4b499-55x25" Apr 25 00:03:37.950127 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:37.950058 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dj7h\" (UniqueName: \"kubernetes.io/projected/9026da9a-5b30-4323-9828-0c7956b74eff-kube-api-access-6dj7h\") pod \"model-serving-api-86f7b4b499-55x25\" (UID: \"9026da9a-5b30-4323-9828-0c7956b74eff\") " pod="kserve/model-serving-api-86f7b4b499-55x25" Apr 25 00:03:37.950220 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:37.950159 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/815ed346-ab88-4b9d-9f6b-a56172cc8a9d-cert\") pod \"odh-model-controller-696fc77849-5rhzw\" (UID: \"815ed346-ab88-4b9d-9f6b-a56172cc8a9d\") " pod="kserve/odh-model-controller-696fc77849-5rhzw" Apr 25 00:03:38.051531 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:38.051501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/815ed346-ab88-4b9d-9f6b-a56172cc8a9d-cert\") pod \"odh-model-controller-696fc77849-5rhzw\" (UID: \"815ed346-ab88-4b9d-9f6b-a56172cc8a9d\") " pod="kserve/odh-model-controller-696fc77849-5rhzw" Apr 25 00:03:38.051689 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:38.051563 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v4b5h\" (UniqueName: \"kubernetes.io/projected/815ed346-ab88-4b9d-9f6b-a56172cc8a9d-kube-api-access-v4b5h\") pod \"odh-model-controller-696fc77849-5rhzw\" (UID: \"815ed346-ab88-4b9d-9f6b-a56172cc8a9d\") " pod="kserve/odh-model-controller-696fc77849-5rhzw" Apr 25 00:03:38.051689 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:38.051591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9026da9a-5b30-4323-9828-0c7956b74eff-tls-certs\") pod \"model-serving-api-86f7b4b499-55x25\" (UID: \"9026da9a-5b30-4323-9828-0c7956b74eff\") " pod="kserve/model-serving-api-86f7b4b499-55x25" Apr 25 00:03:38.051689 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:38.051622 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dj7h\" (UniqueName: \"kubernetes.io/projected/9026da9a-5b30-4323-9828-0c7956b74eff-kube-api-access-6dj7h\") pod \"model-serving-api-86f7b4b499-55x25\" (UID: \"9026da9a-5b30-4323-9828-0c7956b74eff\") " pod="kserve/model-serving-api-86f7b4b499-55x25" Apr 25 00:03:38.051881 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:03:38.051736 2576 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 25 00:03:38.051881 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:03:38.051824 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9026da9a-5b30-4323-9828-0c7956b74eff-tls-certs podName:9026da9a-5b30-4323-9828-0c7956b74eff nodeName:}" failed. No retries permitted until 2026-04-25 00:03:38.55180209 +0000 UTC m=+579.712751712 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/9026da9a-5b30-4323-9828-0c7956b74eff-tls-certs") pod "model-serving-api-86f7b4b499-55x25" (UID: "9026da9a-5b30-4323-9828-0c7956b74eff") : secret "model-serving-api-tls" not found Apr 25 00:03:38.054097 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:38.054080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/815ed346-ab88-4b9d-9f6b-a56172cc8a9d-cert\") pod \"odh-model-controller-696fc77849-5rhzw\" (UID: \"815ed346-ab88-4b9d-9f6b-a56172cc8a9d\") " pod="kserve/odh-model-controller-696fc77849-5rhzw" Apr 25 00:03:38.062597 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:38.062572 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4b5h\" (UniqueName: \"kubernetes.io/projected/815ed346-ab88-4b9d-9f6b-a56172cc8a9d-kube-api-access-v4b5h\") pod \"odh-model-controller-696fc77849-5rhzw\" (UID: \"815ed346-ab88-4b9d-9f6b-a56172cc8a9d\") " pod="kserve/odh-model-controller-696fc77849-5rhzw" Apr 25 00:03:38.063186 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:38.063162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dj7h\" (UniqueName: \"kubernetes.io/projected/9026da9a-5b30-4323-9828-0c7956b74eff-kube-api-access-6dj7h\") pod \"model-serving-api-86f7b4b499-55x25\" (UID: \"9026da9a-5b30-4323-9828-0c7956b74eff\") " pod="kserve/model-serving-api-86f7b4b499-55x25" Apr 25 00:03:38.224183 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:38.224092 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-5rhzw" Apr 25 00:03:38.350230 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:38.350204 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-5rhzw"] Apr 25 00:03:38.352277 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:03:38.352247 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod815ed346_ab88_4b9d_9f6b_a56172cc8a9d.slice/crio-ec1835eb17f9dbccfab203eedba37f1e46076a0d6208fe73bccbc5c37ab39359 WatchSource:0}: Error finding container ec1835eb17f9dbccfab203eedba37f1e46076a0d6208fe73bccbc5c37ab39359: Status 404 returned error can't find the container with id ec1835eb17f9dbccfab203eedba37f1e46076a0d6208fe73bccbc5c37ab39359 Apr 25 00:03:38.447199 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:38.447166 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-5rhzw" event={"ID":"815ed346-ab88-4b9d-9f6b-a56172cc8a9d","Type":"ContainerStarted","Data":"ec1835eb17f9dbccfab203eedba37f1e46076a0d6208fe73bccbc5c37ab39359"} Apr 25 00:03:38.556086 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:38.556046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9026da9a-5b30-4323-9828-0c7956b74eff-tls-certs\") pod \"model-serving-api-86f7b4b499-55x25\" (UID: \"9026da9a-5b30-4323-9828-0c7956b74eff\") " pod="kserve/model-serving-api-86f7b4b499-55x25" Apr 25 00:03:38.558610 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:38.558586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9026da9a-5b30-4323-9828-0c7956b74eff-tls-certs\") pod \"model-serving-api-86f7b4b499-55x25\" (UID: \"9026da9a-5b30-4323-9828-0c7956b74eff\") " pod="kserve/model-serving-api-86f7b4b499-55x25" Apr 25 00:03:38.785859 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:03:38.785807 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-55x25" Apr 25 00:03:38.921322 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:38.921285 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-55x25"] Apr 25 00:03:38.925190 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:03:38.925158 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9026da9a_5b30_4323_9828_0c7956b74eff.slice/crio-d05d8a815e7dbc0c69321fcf8a1fd5e9945af02c072c91011416ad395abd18bc WatchSource:0}: Error finding container d05d8a815e7dbc0c69321fcf8a1fd5e9945af02c072c91011416ad395abd18bc: Status 404 returned error can't find the container with id d05d8a815e7dbc0c69321fcf8a1fd5e9945af02c072c91011416ad395abd18bc Apr 25 00:03:39.454719 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:39.454646 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-55x25" event={"ID":"9026da9a-5b30-4323-9828-0c7956b74eff","Type":"ContainerStarted","Data":"d05d8a815e7dbc0c69321fcf8a1fd5e9945af02c072c91011416ad395abd18bc"} Apr 25 00:03:42.467716 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:42.467599 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-55x25" event={"ID":"9026da9a-5b30-4323-9828-0c7956b74eff","Type":"ContainerStarted","Data":"fc9c9ae0abbc122533d87ced390760e18e5aeb9f35fedf522dc9c469442128ae"} Apr 25 00:03:42.467716 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:42.467688 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-55x25" Apr 25 00:03:42.468972 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:42.468951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-5rhzw" 
event={"ID":"815ed346-ab88-4b9d-9f6b-a56172cc8a9d","Type":"ContainerStarted","Data":"48d9dd92db9daab9ca5e424229e0c3917a72d47cb0cea7d23a1fe89c5655677d"} Apr 25 00:03:42.469077 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:42.469061 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-5rhzw" Apr 25 00:03:42.485707 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:42.485652 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-55x25" podStartSLOduration=2.315091004 podStartE2EDuration="5.485639719s" podCreationTimestamp="2026-04-25 00:03:37 +0000 UTC" firstStartedPulling="2026-04-25 00:03:38.927966459 +0000 UTC m=+580.088916088" lastFinishedPulling="2026-04-25 00:03:42.098515185 +0000 UTC m=+583.259464803" observedRunningTime="2026-04-25 00:03:42.483192849 +0000 UTC m=+583.644142490" watchObservedRunningTime="2026-04-25 00:03:42.485639719 +0000 UTC m=+583.646589359" Apr 25 00:03:42.496942 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:42.496894 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-5rhzw" podStartSLOduration=1.807631473 podStartE2EDuration="5.496880889s" podCreationTimestamp="2026-04-25 00:03:37 +0000 UTC" firstStartedPulling="2026-04-25 00:03:38.353617224 +0000 UTC m=+579.514566848" lastFinishedPulling="2026-04-25 00:03:42.042866632 +0000 UTC m=+583.203816264" observedRunningTime="2026-04-25 00:03:42.496115299 +0000 UTC m=+583.657064939" watchObservedRunningTime="2026-04-25 00:03:42.496880889 +0000 UTC m=+583.657830529" Apr 25 00:03:53.474353 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:53.474321 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-5rhzw" Apr 25 00:03:53.476564 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:53.476546 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve/model-serving-api-86f7b4b499-55x25" Apr 25 00:03:54.273252 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:54.273218 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-xh7xg"] Apr 25 00:03:54.279568 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:54.279543 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-xh7xg" Apr 25 00:03:54.282514 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:54.282484 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-xh7xg"] Apr 25 00:03:54.390261 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:54.390225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgc64\" (UniqueName: \"kubernetes.io/projected/2ae7c771-1c98-4e8d-bbbc-3c579d314ba2-kube-api-access-wgc64\") pod \"s3-init-xh7xg\" (UID: \"2ae7c771-1c98-4e8d-bbbc-3c579d314ba2\") " pod="kserve/s3-init-xh7xg" Apr 25 00:03:54.491540 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:54.491499 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgc64\" (UniqueName: \"kubernetes.io/projected/2ae7c771-1c98-4e8d-bbbc-3c579d314ba2-kube-api-access-wgc64\") pod \"s3-init-xh7xg\" (UID: \"2ae7c771-1c98-4e8d-bbbc-3c579d314ba2\") " pod="kserve/s3-init-xh7xg" Apr 25 00:03:54.499493 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:54.499465 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgc64\" (UniqueName: \"kubernetes.io/projected/2ae7c771-1c98-4e8d-bbbc-3c579d314ba2-kube-api-access-wgc64\") pod \"s3-init-xh7xg\" (UID: \"2ae7c771-1c98-4e8d-bbbc-3c579d314ba2\") " pod="kserve/s3-init-xh7xg" Apr 25 00:03:54.603471 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:54.603374 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-xh7xg" Apr 25 00:03:54.735589 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:54.735464 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-xh7xg"] Apr 25 00:03:54.738447 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:03:54.738417 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ae7c771_1c98_4e8d_bbbc_3c579d314ba2.slice/crio-e9e6789cf07754d06bb196d672030672a162692c658bb6acf3f489b69f3ffed0 WatchSource:0}: Error finding container e9e6789cf07754d06bb196d672030672a162692c658bb6acf3f489b69f3ffed0: Status 404 returned error can't find the container with id e9e6789cf07754d06bb196d672030672a162692c658bb6acf3f489b69f3ffed0 Apr 25 00:03:55.519416 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:55.519370 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-xh7xg" event={"ID":"2ae7c771-1c98-4e8d-bbbc-3c579d314ba2","Type":"ContainerStarted","Data":"e9e6789cf07754d06bb196d672030672a162692c658bb6acf3f489b69f3ffed0"} Apr 25 00:03:59.317467 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:59.317443 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/1.log" Apr 25 00:03:59.317898 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:59.317448 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/1.log" Apr 25 00:03:59.322331 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:59.322311 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log" Apr 25 00:03:59.322423 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:59.322360 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log" Apr 25 00:03:59.535643 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:59.535609 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-xh7xg" event={"ID":"2ae7c771-1c98-4e8d-bbbc-3c579d314ba2","Type":"ContainerStarted","Data":"a4b435936ee6a32762f344c820e54a82dbf62addbe4cfa521259af211f48ba46"} Apr 25 00:03:59.552496 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:03:59.552446 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-xh7xg" podStartSLOduration=1.066075741 podStartE2EDuration="5.552432296s" podCreationTimestamp="2026-04-25 00:03:54 +0000 UTC" firstStartedPulling="2026-04-25 00:03:54.740116163 +0000 UTC m=+595.901065783" lastFinishedPulling="2026-04-25 00:03:59.226472715 +0000 UTC m=+600.387422338" observedRunningTime="2026-04-25 00:03:59.550873685 +0000 UTC m=+600.711823326" watchObservedRunningTime="2026-04-25 00:03:59.552432296 +0000 UTC m=+600.713381937" Apr 25 00:04:02.549312 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:02.549282 2576 generic.go:358] "Generic (PLEG): container finished" podID="2ae7c771-1c98-4e8d-bbbc-3c579d314ba2" containerID="a4b435936ee6a32762f344c820e54a82dbf62addbe4cfa521259af211f48ba46" exitCode=0 Apr 25 00:04:02.549631 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:02.549358 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-xh7xg" event={"ID":"2ae7c771-1c98-4e8d-bbbc-3c579d314ba2","Type":"ContainerDied","Data":"a4b435936ee6a32762f344c820e54a82dbf62addbe4cfa521259af211f48ba46"} Apr 25 00:04:03.692156 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:03.692129 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-xh7xg" Apr 25 00:04:03.776105 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:03.776065 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgc64\" (UniqueName: \"kubernetes.io/projected/2ae7c771-1c98-4e8d-bbbc-3c579d314ba2-kube-api-access-wgc64\") pod \"2ae7c771-1c98-4e8d-bbbc-3c579d314ba2\" (UID: \"2ae7c771-1c98-4e8d-bbbc-3c579d314ba2\") " Apr 25 00:04:03.778462 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:03.778421 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ae7c771-1c98-4e8d-bbbc-3c579d314ba2-kube-api-access-wgc64" (OuterVolumeSpecName: "kube-api-access-wgc64") pod "2ae7c771-1c98-4e8d-bbbc-3c579d314ba2" (UID: "2ae7c771-1c98-4e8d-bbbc-3c579d314ba2"). InnerVolumeSpecName "kube-api-access-wgc64". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:04:03.877176 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:03.877076 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wgc64\" (UniqueName: \"kubernetes.io/projected/2ae7c771-1c98-4e8d-bbbc-3c579d314ba2-kube-api-access-wgc64\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:04:04.557040 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:04.557009 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-xh7xg" Apr 25 00:04:04.557245 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:04.557038 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-xh7xg" event={"ID":"2ae7c771-1c98-4e8d-bbbc-3c579d314ba2","Type":"ContainerDied","Data":"e9e6789cf07754d06bb196d672030672a162692c658bb6acf3f489b69f3ffed0"} Apr 25 00:04:04.557245 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:04.557068 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e6789cf07754d06bb196d672030672a162692c658bb6acf3f489b69f3ffed0" Apr 25 00:04:13.550344 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.550258 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99"] Apr 25 00:04:13.550854 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.550629 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ae7c771-1c98-4e8d-bbbc-3c579d314ba2" containerName="s3-init" Apr 25 00:04:13.550854 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.550638 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae7c771-1c98-4e8d-bbbc-3c579d314ba2" containerName="s3-init" Apr 25 00:04:13.550854 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.550728 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ae7c771-1c98-4e8d-bbbc-3c579d314ba2" containerName="s3-init" Apr 25 00:04:13.554190 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.554172 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:13.556465 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.556440 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 25 00:04:13.556589 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.556537 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gz2jj\"" Apr 25 00:04:13.556589 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.556557 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-8fb8f-predictor-serving-cert\"" Apr 25 00:04:13.556589 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.556543 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-8fb8f-kube-rbac-proxy-sar-config\"" Apr 25 00:04:13.557058 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.557043 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 25 00:04:13.560949 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.560926 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fce959c1-497c-4e87-8e24-4b5ef6e69338-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:13.561060 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.560962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fce959c1-497c-4e87-8e24-4b5ef6e69338-proxy-tls\") pod \"isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:13.561060 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.561014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-raw-sklearn-batcher-8fb8f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fce959c1-497c-4e87-8e24-4b5ef6e69338-isvc-raw-sklearn-batcher-8fb8f-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:13.561060 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.561037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnk6k\" (UniqueName: \"kubernetes.io/projected/fce959c1-497c-4e87-8e24-4b5ef6e69338-kube-api-access-bnk6k\") pod \"isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:13.565152 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.565124 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99"] Apr 25 00:04:13.661578 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.661541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fce959c1-497c-4e87-8e24-4b5ef6e69338-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:13.661578 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.661579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fce959c1-497c-4e87-8e24-4b5ef6e69338-proxy-tls\") pod \"isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:13.661798 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.661626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-8fb8f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fce959c1-497c-4e87-8e24-4b5ef6e69338-isvc-raw-sklearn-batcher-8fb8f-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:13.661798 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.661653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnk6k\" (UniqueName: \"kubernetes.io/projected/fce959c1-497c-4e87-8e24-4b5ef6e69338-kube-api-access-bnk6k\") pod \"isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:13.661978 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.661958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fce959c1-497c-4e87-8e24-4b5ef6e69338-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:13.662264 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.662244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-8fb8f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fce959c1-497c-4e87-8e24-4b5ef6e69338-isvc-raw-sklearn-batcher-8fb8f-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:13.664133 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.664112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fce959c1-497c-4e87-8e24-4b5ef6e69338-proxy-tls\") pod \"isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:13.669110 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.669088 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnk6k\" (UniqueName: \"kubernetes.io/projected/fce959c1-497c-4e87-8e24-4b5ef6e69338-kube-api-access-bnk6k\") pod \"isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:13.865649 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.865577 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:13.996530 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:13.996501 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99"] Apr 25 00:04:13.998738 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:04:13.998681 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfce959c1_497c_4e87_8e24_4b5ef6e69338.slice/crio-94c151c840e2ea42eedf3149a018068e52eade26b4265649f855ddf7f033f146 WatchSource:0}: Error finding container 94c151c840e2ea42eedf3149a018068e52eade26b4265649f855ddf7f033f146: Status 404 returned error can't find the container with id 94c151c840e2ea42eedf3149a018068e52eade26b4265649f855ddf7f033f146 Apr 25 00:04:14.592281 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:14.592242 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" event={"ID":"fce959c1-497c-4e87-8e24-4b5ef6e69338","Type":"ContainerStarted","Data":"94c151c840e2ea42eedf3149a018068e52eade26b4265649f855ddf7f033f146"} Apr 25 00:04:17.604406 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:17.604363 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" event={"ID":"fce959c1-497c-4e87-8e24-4b5ef6e69338","Type":"ContainerStarted","Data":"243dc3a3c82a31f1794720fcbf3fed7b77593d0698fb8af053f3549cd8ae9d18"} Apr 25 00:04:21.619277 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:21.619233 2576 generic.go:358] "Generic (PLEG): container finished" podID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerID="243dc3a3c82a31f1794720fcbf3fed7b77593d0698fb8af053f3549cd8ae9d18" exitCode=0 Apr 25 00:04:21.621808 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:21.619311 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" event={"ID":"fce959c1-497c-4e87-8e24-4b5ef6e69338","Type":"ContainerDied","Data":"243dc3a3c82a31f1794720fcbf3fed7b77593d0698fb8af053f3549cd8ae9d18"} Apr 25 00:04:35.681263 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:35.681220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" event={"ID":"fce959c1-497c-4e87-8e24-4b5ef6e69338","Type":"ContainerStarted","Data":"c13205dc5814210654368514c2b2fce2689726b40570ddf80bf2d4074256ebc4"} Apr 25 00:04:38.696047 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:38.696006 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" event={"ID":"fce959c1-497c-4e87-8e24-4b5ef6e69338","Type":"ContainerStarted","Data":"7f728504e9846a6f92b787f8503dff78f1369267c646fc862cede48bf09fe2a8"} Apr 25 00:04:40.705769 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:40.705726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" event={"ID":"fce959c1-497c-4e87-8e24-4b5ef6e69338","Type":"ContainerStarted","Data":"8cf852e62a927eb4d37e65699558189d9d9257d91f8c379df9a3473bce119a45"} Apr 25 00:04:40.706135 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:40.705976 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:40.706135 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:40.706081 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:40.707581 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:40.707554 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:04:40.730392 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:40.730333 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podStartSLOduration=1.244939595 podStartE2EDuration="27.730319331s" podCreationTimestamp="2026-04-25 00:04:13 +0000 UTC" firstStartedPulling="2026-04-25 00:04:14.00038628 +0000 UTC m=+615.161335912" lastFinishedPulling="2026-04-25 00:04:40.485766025 +0000 UTC m=+641.646715648" observedRunningTime="2026-04-25 00:04:40.728584664 +0000 UTC m=+641.889534306" watchObservedRunningTime="2026-04-25 00:04:40.730319331 +0000 UTC m=+641.891269001" Apr 25 00:04:41.709409 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:41.709377 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:41.709888 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:41.709555 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:04:41.710493 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:41.710472 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:04:42.713733 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:42.713645 2576 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:04:42.714395 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:42.714366 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:04:42.717745 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:42.717716 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:04:43.717101 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:43.717057 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:04:43.717555 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:43.717525 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:04:53.717498 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:53.717444 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: 
connect: connection refused" Apr 25 00:04:53.718124 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:04:53.717887 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:05:03.720718 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:03.720653 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:05:03.721169 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:03.721144 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:05:13.717879 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:13.717818 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:05:13.718372 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:13.718348 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:05:23.717297 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:23.717248 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:05:23.717816 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:23.717731 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:05:33.717620 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:33.717567 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:05:33.718163 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:33.717972 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:05:43.717891 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:43.717853 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:05:43.718377 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:43.718340 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:05:58.600054 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.600021 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99"] Apr 25 00:05:58.600490 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.600339 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" containerID="cri-o://c13205dc5814210654368514c2b2fce2689726b40570ddf80bf2d4074256ebc4" gracePeriod=30 Apr 25 00:05:58.600490 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.600384 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent" containerID="cri-o://8cf852e62a927eb4d37e65699558189d9d9257d91f8c379df9a3473bce119a45" gracePeriod=30 Apr 25 00:05:58.600724 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.600395 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kube-rbac-proxy" containerID="cri-o://7f728504e9846a6f92b787f8503dff78f1369267c646fc862cede48bf09fe2a8" gracePeriod=30 Apr 25 00:05:58.691119 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.691080 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv"] Apr 25 00:05:58.695608 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.695584 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:05:58.697667 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.697641 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-3b499-predictor-serving-cert\"" Apr 25 00:05:58.697818 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.697655 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-3b499-kube-rbac-proxy-sar-config\"" Apr 25 00:05:58.701056 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.701031 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv"] Apr 25 00:05:58.875890 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.875801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-3b499-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/051a1f66-e6a7-47a9-8eab-ec1836b062fa-isvc-sklearn-graph-raw-3b499-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:05:58.875890 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.875869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/051a1f66-e6a7-47a9-8eab-ec1836b062fa-proxy-tls\") pod \"isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:05:58.875890 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.875890 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbkp2\" (UniqueName: \"kubernetes.io/projected/051a1f66-e6a7-47a9-8eab-ec1836b062fa-kube-api-access-fbkp2\") pod \"isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:05:58.876154 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.875924 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/051a1f66-e6a7-47a9-8eab-ec1836b062fa-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:05:58.976344 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.976311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/051a1f66-e6a7-47a9-8eab-ec1836b062fa-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:05:58.976611 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.976380 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-3b499-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/051a1f66-e6a7-47a9-8eab-ec1836b062fa-isvc-sklearn-graph-raw-3b499-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:05:58.976611 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:05:58.976451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/051a1f66-e6a7-47a9-8eab-ec1836b062fa-proxy-tls\") pod \"isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:05:58.976611 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.976480 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbkp2\" (UniqueName: \"kubernetes.io/projected/051a1f66-e6a7-47a9-8eab-ec1836b062fa-kube-api-access-fbkp2\") pod \"isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:05:58.976822 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.976710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/051a1f66-e6a7-47a9-8eab-ec1836b062fa-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:05:58.977238 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.977204 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-3b499-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/051a1f66-e6a7-47a9-8eab-ec1836b062fa-isvc-sklearn-graph-raw-3b499-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:05:58.977560 ip-10-0-132-64 kubenswrapper[2576]: I0425 
00:05:58.977531 2576 generic.go:358] "Generic (PLEG): container finished" podID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerID="7f728504e9846a6f92b787f8503dff78f1369267c646fc862cede48bf09fe2a8" exitCode=2 Apr 25 00:05:58.977716 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.977565 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" event={"ID":"fce959c1-497c-4e87-8e24-4b5ef6e69338","Type":"ContainerDied","Data":"7f728504e9846a6f92b787f8503dff78f1369267c646fc862cede48bf09fe2a8"} Apr 25 00:05:58.979138 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.979118 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/051a1f66-e6a7-47a9-8eab-ec1836b062fa-proxy-tls\") pod \"isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:05:58.983668 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:58.983649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbkp2\" (UniqueName: \"kubernetes.io/projected/051a1f66-e6a7-47a9-8eab-ec1836b062fa-kube-api-access-fbkp2\") pod \"isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:05:59.009525 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:59.009502 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:05:59.137918 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:59.137889 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv"] Apr 25 00:05:59.140086 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:05:59.140057 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod051a1f66_e6a7_47a9_8eab_ec1836b062fa.slice/crio-018b1883d2d7ab5ff9902afecfc9aa9edbdd455a5ebc0b3c68fbeb67d8f93e11 WatchSource:0}: Error finding container 018b1883d2d7ab5ff9902afecfc9aa9edbdd455a5ebc0b3c68fbeb67d8f93e11: Status 404 returned error can't find the container with id 018b1883d2d7ab5ff9902afecfc9aa9edbdd455a5ebc0b3c68fbeb67d8f93e11 Apr 25 00:05:59.141929 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:59.141912 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:05:59.983022 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:59.982985 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" event={"ID":"051a1f66-e6a7-47a9-8eab-ec1836b062fa","Type":"ContainerStarted","Data":"27f27361149968ea2104d0264894def205267aad017b21922e32221d34081f27"} Apr 25 00:05:59.983022 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:05:59.983025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" event={"ID":"051a1f66-e6a7-47a9-8eab-ec1836b062fa","Type":"ContainerStarted","Data":"018b1883d2d7ab5ff9902afecfc9aa9edbdd455a5ebc0b3c68fbeb67d8f93e11"} Apr 25 00:06:02.714299 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:02.714256 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 25 00:06:02.998323 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:02.998293 2576 generic.go:358] "Generic (PLEG): container finished" podID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerID="c13205dc5814210654368514c2b2fce2689726b40570ddf80bf2d4074256ebc4" exitCode=0 Apr 25 00:06:02.998479 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:02.998331 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" event={"ID":"fce959c1-497c-4e87-8e24-4b5ef6e69338","Type":"ContainerDied","Data":"c13205dc5814210654368514c2b2fce2689726b40570ddf80bf2d4074256ebc4"} Apr 25 00:06:03.717000 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:03.716957 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:06:03.719665 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:03.719637 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:06:04.004176 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:04.004140 2576 generic.go:358] "Generic (PLEG): container finished" podID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerID="27f27361149968ea2104d0264894def205267aad017b21922e32221d34081f27" exitCode=0 Apr 25 00:06:04.004343 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:06:04.004210 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" event={"ID":"051a1f66-e6a7-47a9-8eab-ec1836b062fa","Type":"ContainerDied","Data":"27f27361149968ea2104d0264894def205267aad017b21922e32221d34081f27"} Apr 25 00:06:05.010066 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:05.010035 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" event={"ID":"051a1f66-e6a7-47a9-8eab-ec1836b062fa","Type":"ContainerStarted","Data":"b35ee0c4a1e2be8b9e1e22f4ca789cff6c6f477a05810f3c6cd41ba015a74606"} Apr 25 00:06:05.010490 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:05.010075 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" event={"ID":"051a1f66-e6a7-47a9-8eab-ec1836b062fa","Type":"ContainerStarted","Data":"b72d5f2c13d16e34d739e63ef19c76fd24535f684b3b70542147cacc7db6663f"} Apr 25 00:06:05.010490 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:05.010349 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:06:05.010490 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:05.010473 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:06:05.011827 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:05.011801 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 00:06:05.029558 ip-10-0-132-64 kubenswrapper[2576]: I0425 
00:06:05.029510 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" podStartSLOduration=7.029492602 podStartE2EDuration="7.029492602s" podCreationTimestamp="2026-04-25 00:05:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:06:05.026755866 +0000 UTC m=+726.187705507" watchObservedRunningTime="2026-04-25 00:06:05.029492602 +0000 UTC m=+726.190442245" Apr 25 00:06:06.013714 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:06.013651 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 00:06:07.714113 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:07.714069 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 25 00:06:11.018672 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:11.018641 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:06:11.019365 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:11.019334 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 
00:06:12.714386 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:12.714331 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 25 00:06:12.714870 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:12.714524 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:06:13.717112 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:13.717060 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:06:13.718900 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:13.718871 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:06:17.714072 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:17.714016 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 25 00:06:21.019511 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:21.019457 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 00:06:22.714382 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:22.714332 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 25 00:06:23.717517 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:23.717476 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:06:23.717985 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:23.717656 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:06:23.719117 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:23.719087 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:06:23.719233 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:23.719218 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:06:27.714241 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:27.714198 2576 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 25 00:06:28.745440 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:28.745418 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:06:28.828333 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:28.828297 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-8fb8f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fce959c1-497c-4e87-8e24-4b5ef6e69338-isvc-raw-sklearn-batcher-8fb8f-kube-rbac-proxy-sar-config\") pod \"fce959c1-497c-4e87-8e24-4b5ef6e69338\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " Apr 25 00:06:28.828518 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:28.828362 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fce959c1-497c-4e87-8e24-4b5ef6e69338-kserve-provision-location\") pod \"fce959c1-497c-4e87-8e24-4b5ef6e69338\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " Apr 25 00:06:28.828518 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:28.828380 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fce959c1-497c-4e87-8e24-4b5ef6e69338-proxy-tls\") pod \"fce959c1-497c-4e87-8e24-4b5ef6e69338\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " Apr 25 00:06:28.828518 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:28.828441 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnk6k\" (UniqueName: 
\"kubernetes.io/projected/fce959c1-497c-4e87-8e24-4b5ef6e69338-kube-api-access-bnk6k\") pod \"fce959c1-497c-4e87-8e24-4b5ef6e69338\" (UID: \"fce959c1-497c-4e87-8e24-4b5ef6e69338\") " Apr 25 00:06:28.828688 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:28.828648 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fce959c1-497c-4e87-8e24-4b5ef6e69338-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fce959c1-497c-4e87-8e24-4b5ef6e69338" (UID: "fce959c1-497c-4e87-8e24-4b5ef6e69338"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:06:28.828794 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:28.828758 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fce959c1-497c-4e87-8e24-4b5ef6e69338-isvc-raw-sklearn-batcher-8fb8f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-raw-sklearn-batcher-8fb8f-kube-rbac-proxy-sar-config") pod "fce959c1-497c-4e87-8e24-4b5ef6e69338" (UID: "fce959c1-497c-4e87-8e24-4b5ef6e69338"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-8fb8f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:06:28.830820 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:28.830797 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fce959c1-497c-4e87-8e24-4b5ef6e69338-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fce959c1-497c-4e87-8e24-4b5ef6e69338" (UID: "fce959c1-497c-4e87-8e24-4b5ef6e69338"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:06:28.830924 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:28.830855 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fce959c1-497c-4e87-8e24-4b5ef6e69338-kube-api-access-bnk6k" (OuterVolumeSpecName: "kube-api-access-bnk6k") pod "fce959c1-497c-4e87-8e24-4b5ef6e69338" (UID: "fce959c1-497c-4e87-8e24-4b5ef6e69338"). InnerVolumeSpecName "kube-api-access-bnk6k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:06:28.930079 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:28.929986 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-8fb8f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fce959c1-497c-4e87-8e24-4b5ef6e69338-isvc-raw-sklearn-batcher-8fb8f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:06:28.930079 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:28.930027 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fce959c1-497c-4e87-8e24-4b5ef6e69338-kserve-provision-location\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:06:28.930079 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:28.930037 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fce959c1-497c-4e87-8e24-4b5ef6e69338-proxy-tls\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:06:28.930079 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:28.930047 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bnk6k\" (UniqueName: \"kubernetes.io/projected/fce959c1-497c-4e87-8e24-4b5ef6e69338-kube-api-access-bnk6k\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:06:29.098269 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.098229 2576 generic.go:358] "Generic 
(PLEG): container finished" podID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerID="8cf852e62a927eb4d37e65699558189d9d9257d91f8c379df9a3473bce119a45" exitCode=0 Apr 25 00:06:29.098426 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.098287 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" event={"ID":"fce959c1-497c-4e87-8e24-4b5ef6e69338","Type":"ContainerDied","Data":"8cf852e62a927eb4d37e65699558189d9d9257d91f8c379df9a3473bce119a45"} Apr 25 00:06:29.098426 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.098315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" event={"ID":"fce959c1-497c-4e87-8e24-4b5ef6e69338","Type":"ContainerDied","Data":"94c151c840e2ea42eedf3149a018068e52eade26b4265649f855ddf7f033f146"} Apr 25 00:06:29.098426 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.098314 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99" Apr 25 00:06:29.098426 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.098380 2576 scope.go:117] "RemoveContainer" containerID="8cf852e62a927eb4d37e65699558189d9d9257d91f8c379df9a3473bce119a45" Apr 25 00:06:29.107615 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.107597 2576 scope.go:117] "RemoveContainer" containerID="7f728504e9846a6f92b787f8503dff78f1369267c646fc862cede48bf09fe2a8" Apr 25 00:06:29.116944 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.116927 2576 scope.go:117] "RemoveContainer" containerID="c13205dc5814210654368514c2b2fce2689726b40570ddf80bf2d4074256ebc4" Apr 25 00:06:29.121644 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.121622 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99"] Apr 25 00:06:29.125983 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.125958 2576 scope.go:117] "RemoveContainer" containerID="243dc3a3c82a31f1794720fcbf3fed7b77593d0698fb8af053f3549cd8ae9d18" Apr 25 00:06:29.126276 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.126252 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8fb8f-predictor-59f6ff8c9-44d99"] Apr 25 00:06:29.133329 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.133310 2576 scope.go:117] "RemoveContainer" containerID="8cf852e62a927eb4d37e65699558189d9d9257d91f8c379df9a3473bce119a45" Apr 25 00:06:29.133584 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:06:29.133564 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf852e62a927eb4d37e65699558189d9d9257d91f8c379df9a3473bce119a45\": container with ID starting with 8cf852e62a927eb4d37e65699558189d9d9257d91f8c379df9a3473bce119a45 not found: ID does not exist" 
containerID="8cf852e62a927eb4d37e65699558189d9d9257d91f8c379df9a3473bce119a45" Apr 25 00:06:29.133653 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.133596 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf852e62a927eb4d37e65699558189d9d9257d91f8c379df9a3473bce119a45"} err="failed to get container status \"8cf852e62a927eb4d37e65699558189d9d9257d91f8c379df9a3473bce119a45\": rpc error: code = NotFound desc = could not find container \"8cf852e62a927eb4d37e65699558189d9d9257d91f8c379df9a3473bce119a45\": container with ID starting with 8cf852e62a927eb4d37e65699558189d9d9257d91f8c379df9a3473bce119a45 not found: ID does not exist" Apr 25 00:06:29.133653 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.133620 2576 scope.go:117] "RemoveContainer" containerID="7f728504e9846a6f92b787f8503dff78f1369267c646fc862cede48bf09fe2a8" Apr 25 00:06:29.133879 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:06:29.133859 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f728504e9846a6f92b787f8503dff78f1369267c646fc862cede48bf09fe2a8\": container with ID starting with 7f728504e9846a6f92b787f8503dff78f1369267c646fc862cede48bf09fe2a8 not found: ID does not exist" containerID="7f728504e9846a6f92b787f8503dff78f1369267c646fc862cede48bf09fe2a8" Apr 25 00:06:29.133923 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.133886 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f728504e9846a6f92b787f8503dff78f1369267c646fc862cede48bf09fe2a8"} err="failed to get container status \"7f728504e9846a6f92b787f8503dff78f1369267c646fc862cede48bf09fe2a8\": rpc error: code = NotFound desc = could not find container \"7f728504e9846a6f92b787f8503dff78f1369267c646fc862cede48bf09fe2a8\": container with ID starting with 7f728504e9846a6f92b787f8503dff78f1369267c646fc862cede48bf09fe2a8 not found: ID does not exist" Apr 25 
00:06:29.133923 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.133902 2576 scope.go:117] "RemoveContainer" containerID="c13205dc5814210654368514c2b2fce2689726b40570ddf80bf2d4074256ebc4" Apr 25 00:06:29.134128 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:06:29.134114 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c13205dc5814210654368514c2b2fce2689726b40570ddf80bf2d4074256ebc4\": container with ID starting with c13205dc5814210654368514c2b2fce2689726b40570ddf80bf2d4074256ebc4 not found: ID does not exist" containerID="c13205dc5814210654368514c2b2fce2689726b40570ddf80bf2d4074256ebc4" Apr 25 00:06:29.134172 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.134131 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c13205dc5814210654368514c2b2fce2689726b40570ddf80bf2d4074256ebc4"} err="failed to get container status \"c13205dc5814210654368514c2b2fce2689726b40570ddf80bf2d4074256ebc4\": rpc error: code = NotFound desc = could not find container \"c13205dc5814210654368514c2b2fce2689726b40570ddf80bf2d4074256ebc4\": container with ID starting with c13205dc5814210654368514c2b2fce2689726b40570ddf80bf2d4074256ebc4 not found: ID does not exist" Apr 25 00:06:29.134172 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.134143 2576 scope.go:117] "RemoveContainer" containerID="243dc3a3c82a31f1794720fcbf3fed7b77593d0698fb8af053f3549cd8ae9d18" Apr 25 00:06:29.134330 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:06:29.134311 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"243dc3a3c82a31f1794720fcbf3fed7b77593d0698fb8af053f3549cd8ae9d18\": container with ID starting with 243dc3a3c82a31f1794720fcbf3fed7b77593d0698fb8af053f3549cd8ae9d18 not found: ID does not exist" containerID="243dc3a3c82a31f1794720fcbf3fed7b77593d0698fb8af053f3549cd8ae9d18" Apr 25 00:06:29.134381 
ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.134339 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"243dc3a3c82a31f1794720fcbf3fed7b77593d0698fb8af053f3549cd8ae9d18"} err="failed to get container status \"243dc3a3c82a31f1794720fcbf3fed7b77593d0698fb8af053f3549cd8ae9d18\": rpc error: code = NotFound desc = could not find container \"243dc3a3c82a31f1794720fcbf3fed7b77593d0698fb8af053f3549cd8ae9d18\": container with ID starting with 243dc3a3c82a31f1794720fcbf3fed7b77593d0698fb8af053f3549cd8ae9d18 not found: ID does not exist" Apr 25 00:06:29.400052 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:29.400017 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" path="/var/lib/kubelet/pods/fce959c1-497c-4e87-8e24-4b5ef6e69338/volumes" Apr 25 00:06:31.019831 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:31.019790 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 00:06:41.019670 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:41.019619 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 00:06:51.019406 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:06:51.019349 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection 
refused" Apr 25 00:07:01.019971 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:01.019927 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 00:07:11.019889 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:11.019813 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" Apr 25 00:07:38.732529 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.732493 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"] Apr 25 00:07:38.732989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.732893 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" Apr 25 00:07:38.732989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.732905 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container" Apr 25 00:07:38.732989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.732916 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent" Apr 25 00:07:38.732989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.732921 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent" Apr 25 00:07:38.732989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.732939 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kube-rbac-proxy" Apr 25 00:07:38.732989 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:07:38.732946 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kube-rbac-proxy"
Apr 25 00:07:38.732989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.732952 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="storage-initializer"
Apr 25 00:07:38.732989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.732958 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="storage-initializer"
Apr 25 00:07:38.733225 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.733008 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="agent"
Apr 25 00:07:38.733225 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.733016 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kserve-container"
Apr 25 00:07:38.733225 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.733025 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fce959c1-497c-4e87-8e24-4b5ef6e69338" containerName="kube-rbac-proxy"
Apr 25 00:07:38.736226 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.736210 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"
Apr 25 00:07:38.738136 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.738113 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-3b499-serving-cert\""
Apr 25 00:07:38.738262 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.738117 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-3b499-kube-rbac-proxy-sar-config\""
Apr 25 00:07:38.743821 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.743591 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"]
Apr 25 00:07:38.835075 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.835036 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98cd8436-3894-46a1-8eff-e943920e9581-openshift-service-ca-bundle\") pod \"model-chainer-raw-3b499-57f97cb5b9-zc5fk\" (UID: \"98cd8436-3894-46a1-8eff-e943920e9581\") " pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"
Apr 25 00:07:38.835274 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.835087 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98cd8436-3894-46a1-8eff-e943920e9581-proxy-tls\") pod \"model-chainer-raw-3b499-57f97cb5b9-zc5fk\" (UID: \"98cd8436-3894-46a1-8eff-e943920e9581\") " pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"
Apr 25 00:07:38.935895 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.935853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98cd8436-3894-46a1-8eff-e943920e9581-openshift-service-ca-bundle\") pod \"model-chainer-raw-3b499-57f97cb5b9-zc5fk\" (UID: \"98cd8436-3894-46a1-8eff-e943920e9581\") " pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"
Apr 25 00:07:38.936067 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.935915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98cd8436-3894-46a1-8eff-e943920e9581-proxy-tls\") pod \"model-chainer-raw-3b499-57f97cb5b9-zc5fk\" (UID: \"98cd8436-3894-46a1-8eff-e943920e9581\") " pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"
Apr 25 00:07:38.936527 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.936483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98cd8436-3894-46a1-8eff-e943920e9581-openshift-service-ca-bundle\") pod \"model-chainer-raw-3b499-57f97cb5b9-zc5fk\" (UID: \"98cd8436-3894-46a1-8eff-e943920e9581\") " pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"
Apr 25 00:07:38.938524 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:38.938503 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98cd8436-3894-46a1-8eff-e943920e9581-proxy-tls\") pod \"model-chainer-raw-3b499-57f97cb5b9-zc5fk\" (UID: \"98cd8436-3894-46a1-8eff-e943920e9581\") " pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"
Apr 25 00:07:39.047293 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:39.047274 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"
Apr 25 00:07:39.173028 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:39.173006 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"]
Apr 25 00:07:39.175643 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:07:39.175613 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98cd8436_3894_46a1_8eff_e943920e9581.slice/crio-0ebc009cd87ae33df1200972e0a3061ce7c71d692beb59fffd145b4a811acbfb WatchSource:0}: Error finding container 0ebc009cd87ae33df1200972e0a3061ce7c71d692beb59fffd145b4a811acbfb: Status 404 returned error can't find the container with id 0ebc009cd87ae33df1200972e0a3061ce7c71d692beb59fffd145b4a811acbfb
Apr 25 00:07:39.354524 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:39.354437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk" event={"ID":"98cd8436-3894-46a1-8eff-e943920e9581","Type":"ContainerStarted","Data":"0ebc009cd87ae33df1200972e0a3061ce7c71d692beb59fffd145b4a811acbfb"}
Apr 25 00:07:42.365672 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:42.365633 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk" event={"ID":"98cd8436-3894-46a1-8eff-e943920e9581","Type":"ContainerStarted","Data":"8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857"}
Apr 25 00:07:42.366076 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:42.365740 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"
Apr 25 00:07:42.380568 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:42.380521 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk" podStartSLOduration=2.044492121 podStartE2EDuration="4.380508137s" podCreationTimestamp="2026-04-25 00:07:38 +0000 UTC" firstStartedPulling="2026-04-25 00:07:39.177875695 +0000 UTC m=+820.338825313" lastFinishedPulling="2026-04-25 00:07:41.5138917 +0000 UTC m=+822.674841329" observedRunningTime="2026-04-25 00:07:42.379583 +0000 UTC m=+823.540532642" watchObservedRunningTime="2026-04-25 00:07:42.380508137 +0000 UTC m=+823.541457779"
Apr 25 00:07:48.375306 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:48.375269 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"
Apr 25 00:07:48.802804 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:48.802772 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"]
Apr 25 00:07:48.803078 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:48.803028 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk" podUID="98cd8436-3894-46a1-8eff-e943920e9581" containerName="model-chainer-raw-3b499" containerID="cri-o://8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857" gracePeriod=30
Apr 25 00:07:48.950437 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:48.950400 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv"]
Apr 25 00:07:48.950848 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:48.950800 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kserve-container" containerID="cri-o://b72d5f2c13d16e34d739e63ef19c76fd24535f684b3b70542147cacc7db6663f" gracePeriod=30
Apr 25 00:07:48.950848 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:48.950836 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kube-rbac-proxy" containerID="cri-o://b35ee0c4a1e2be8b9e1e22f4ca789cff6c6f477a05810f3c6cd41ba015a74606" gracePeriod=30
Apr 25 00:07:48.997237 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:48.997212 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"]
Apr 25 00:07:49.001081 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.001063 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:49.003270 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.003247 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\""
Apr 25 00:07:49.003384 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.003303 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-3f0c4-predictor-serving-cert\""
Apr 25 00:07:49.010517 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.010468 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"]
Apr 25 00:07:49.021009 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.020983 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94574d49-dd0d-4611-b527-a47d1fe2edca-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:49.021126 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.021024 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brf2w\" (UniqueName: \"kubernetes.io/projected/94574d49-dd0d-4611-b527-a47d1fe2edca-kube-api-access-brf2w\") pod \"isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:49.021126 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.021071 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94574d49-dd0d-4611-b527-a47d1fe2edca-isvc-sklearn-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:49.021252 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.021193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94574d49-dd0d-4611-b527-a47d1fe2edca-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:49.121909 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.121880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94574d49-dd0d-4611-b527-a47d1fe2edca-isvc-sklearn-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:49.122068 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.121961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94574d49-dd0d-4611-b527-a47d1fe2edca-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:49.122068 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.122046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94574d49-dd0d-4611-b527-a47d1fe2edca-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:49.122225 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.122069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brf2w\" (UniqueName: \"kubernetes.io/projected/94574d49-dd0d-4611-b527-a47d1fe2edca-kube-api-access-brf2w\") pod \"isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:49.122225 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:07:49.122178 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-serving-cert: secret "isvc-sklearn-graph-raw-hpa-3f0c4-predictor-serving-cert" not found
Apr 25 00:07:49.122328 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:07:49.122276 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94574d49-dd0d-4611-b527-a47d1fe2edca-proxy-tls podName:94574d49-dd0d-4611-b527-a47d1fe2edca nodeName:}" failed. No retries permitted until 2026-04-25 00:07:49.622254668 +0000 UTC m=+830.783204287 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/94574d49-dd0d-4611-b527-a47d1fe2edca-proxy-tls") pod "isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" (UID: "94574d49-dd0d-4611-b527-a47d1fe2edca") : secret "isvc-sklearn-graph-raw-hpa-3f0c4-predictor-serving-cert" not found
Apr 25 00:07:49.122389 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.122328 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94574d49-dd0d-4611-b527-a47d1fe2edca-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:49.122638 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.122617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94574d49-dd0d-4611-b527-a47d1fe2edca-isvc-sklearn-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:49.131177 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.131158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brf2w\" (UniqueName: \"kubernetes.io/projected/94574d49-dd0d-4611-b527-a47d1fe2edca-kube-api-access-brf2w\") pod \"isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:49.392187 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.392092 2576 generic.go:358] "Generic (PLEG): container finished" podID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerID="b35ee0c4a1e2be8b9e1e22f4ca789cff6c6f477a05810f3c6cd41ba015a74606" exitCode=2
Apr 25 00:07:49.392187 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.392168 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" event={"ID":"051a1f66-e6a7-47a9-8eab-ec1836b062fa","Type":"ContainerDied","Data":"b35ee0c4a1e2be8b9e1e22f4ca789cff6c6f477a05810f3c6cd41ba015a74606"}
Apr 25 00:07:49.627189 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.627152 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94574d49-dd0d-4611-b527-a47d1fe2edca-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:49.629675 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.629658 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94574d49-dd0d-4611-b527-a47d1fe2edca-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:49.915120 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:49.915077 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:50.045578 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:50.045552 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"]
Apr 25 00:07:50.047765 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:07:50.047733 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94574d49_dd0d_4611_b527_a47d1fe2edca.slice/crio-792b5fd8cc264e41489dc60e231f62d2f85f20d6ba49748c6f4f1b42cd046569 WatchSource:0}: Error finding container 792b5fd8cc264e41489dc60e231f62d2f85f20d6ba49748c6f4f1b42cd046569: Status 404 returned error can't find the container with id 792b5fd8cc264e41489dc60e231f62d2f85f20d6ba49748c6f4f1b42cd046569
Apr 25 00:07:50.396775 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:50.396735 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" event={"ID":"94574d49-dd0d-4611-b527-a47d1fe2edca","Type":"ContainerStarted","Data":"64288967fef535b8430738760ae63045f1b0af1b31c0d4d17eeb55c44178ab5f"}
Apr 25 00:07:50.396775 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:50.396773 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" event={"ID":"94574d49-dd0d-4611-b527-a47d1fe2edca","Type":"ContainerStarted","Data":"792b5fd8cc264e41489dc60e231f62d2f85f20d6ba49748c6f4f1b42cd046569"}
Apr 25 00:07:51.014234 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:51.014184 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.38:8643/healthz\": dial tcp 10.134.0.38:8643: connect: connection refused"
Apr 25 00:07:51.019642 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:51.019587 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 25 00:07:52.998356 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:52.998334 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv"
Apr 25 00:07:53.061133 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.061110 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-3b499-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/051a1f66-e6a7-47a9-8eab-ec1836b062fa-isvc-sklearn-graph-raw-3b499-kube-rbac-proxy-sar-config\") pod \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") "
Apr 25 00:07:53.061282 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.061167 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/051a1f66-e6a7-47a9-8eab-ec1836b062fa-proxy-tls\") pod \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") "
Apr 25 00:07:53.061282 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.061187 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/051a1f66-e6a7-47a9-8eab-ec1836b062fa-kserve-provision-location\") pod \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") "
Apr 25 00:07:53.061282 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.061222 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbkp2\" (UniqueName: \"kubernetes.io/projected/051a1f66-e6a7-47a9-8eab-ec1836b062fa-kube-api-access-fbkp2\") pod \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\" (UID: \"051a1f66-e6a7-47a9-8eab-ec1836b062fa\") "
Apr 25 00:07:53.061573 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.061552 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/051a1f66-e6a7-47a9-8eab-ec1836b062fa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "051a1f66-e6a7-47a9-8eab-ec1836b062fa" (UID: "051a1f66-e6a7-47a9-8eab-ec1836b062fa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:07:53.061638 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.061547 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/051a1f66-e6a7-47a9-8eab-ec1836b062fa-isvc-sklearn-graph-raw-3b499-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-3b499-kube-rbac-proxy-sar-config") pod "051a1f66-e6a7-47a9-8eab-ec1836b062fa" (UID: "051a1f66-e6a7-47a9-8eab-ec1836b062fa"). InnerVolumeSpecName "isvc-sklearn-graph-raw-3b499-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:07:53.063684 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.063659 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/051a1f66-e6a7-47a9-8eab-ec1836b062fa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "051a1f66-e6a7-47a9-8eab-ec1836b062fa" (UID: "051a1f66-e6a7-47a9-8eab-ec1836b062fa"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:07:53.063832 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.063687 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/051a1f66-e6a7-47a9-8eab-ec1836b062fa-kube-api-access-fbkp2" (OuterVolumeSpecName: "kube-api-access-fbkp2") pod "051a1f66-e6a7-47a9-8eab-ec1836b062fa" (UID: "051a1f66-e6a7-47a9-8eab-ec1836b062fa"). InnerVolumeSpecName "kube-api-access-fbkp2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:07:53.161877 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.161833 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/051a1f66-e6a7-47a9-8eab-ec1836b062fa-proxy-tls\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 25 00:07:53.161877 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.161874 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/051a1f66-e6a7-47a9-8eab-ec1836b062fa-kserve-provision-location\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 25 00:07:53.162094 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.161888 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fbkp2\" (UniqueName: \"kubernetes.io/projected/051a1f66-e6a7-47a9-8eab-ec1836b062fa-kube-api-access-fbkp2\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 25 00:07:53.162094 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.161901 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-3b499-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/051a1f66-e6a7-47a9-8eab-ec1836b062fa-isvc-sklearn-graph-raw-3b499-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 25 00:07:53.374301 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.374200 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk" podUID="98cd8436-3894-46a1-8eff-e943920e9581" containerName="model-chainer-raw-3b499" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:07:53.408599 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.408566 2576 generic.go:358] "Generic (PLEG): container finished" podID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerID="b72d5f2c13d16e34d739e63ef19c76fd24535f684b3b70542147cacc7db6663f" exitCode=0
Apr 25 00:07:53.408776 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.408644 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" event={"ID":"051a1f66-e6a7-47a9-8eab-ec1836b062fa","Type":"ContainerDied","Data":"b72d5f2c13d16e34d739e63ef19c76fd24535f684b3b70542147cacc7db6663f"}
Apr 25 00:07:53.408776 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.408651 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv"
Apr 25 00:07:53.408776 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.408681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv" event={"ID":"051a1f66-e6a7-47a9-8eab-ec1836b062fa","Type":"ContainerDied","Data":"018b1883d2d7ab5ff9902afecfc9aa9edbdd455a5ebc0b3c68fbeb67d8f93e11"}
Apr 25 00:07:53.408776 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.408723 2576 scope.go:117] "RemoveContainer" containerID="b35ee0c4a1e2be8b9e1e22f4ca789cff6c6f477a05810f3c6cd41ba015a74606"
Apr 25 00:07:53.417958 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.417941 2576 scope.go:117] "RemoveContainer" containerID="b72d5f2c13d16e34d739e63ef19c76fd24535f684b3b70542147cacc7db6663f"
Apr 25 00:07:53.426156 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.426133 2576 scope.go:117] "RemoveContainer" containerID="27f27361149968ea2104d0264894def205267aad017b21922e32221d34081f27"
Apr 25 00:07:53.428795 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.428768 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv"]
Apr 25 00:07:53.432317 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.432292 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b499-predictor-756fc45b6c-ffgpv"]
Apr 25 00:07:53.435741 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.435723 2576 scope.go:117] "RemoveContainer" containerID="b35ee0c4a1e2be8b9e1e22f4ca789cff6c6f477a05810f3c6cd41ba015a74606"
Apr 25 00:07:53.435958 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:07:53.435943 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b35ee0c4a1e2be8b9e1e22f4ca789cff6c6f477a05810f3c6cd41ba015a74606\": container with ID starting with b35ee0c4a1e2be8b9e1e22f4ca789cff6c6f477a05810f3c6cd41ba015a74606 not found: ID does not exist" containerID="b35ee0c4a1e2be8b9e1e22f4ca789cff6c6f477a05810f3c6cd41ba015a74606"
Apr 25 00:07:53.436011 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.435966 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b35ee0c4a1e2be8b9e1e22f4ca789cff6c6f477a05810f3c6cd41ba015a74606"} err="failed to get container status \"b35ee0c4a1e2be8b9e1e22f4ca789cff6c6f477a05810f3c6cd41ba015a74606\": rpc error: code = NotFound desc = could not find container \"b35ee0c4a1e2be8b9e1e22f4ca789cff6c6f477a05810f3c6cd41ba015a74606\": container with ID starting with b35ee0c4a1e2be8b9e1e22f4ca789cff6c6f477a05810f3c6cd41ba015a74606 not found: ID does not exist"
Apr 25 00:07:53.436011 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.435981 2576 scope.go:117] "RemoveContainer" containerID="b72d5f2c13d16e34d739e63ef19c76fd24535f684b3b70542147cacc7db6663f"
Apr 25 00:07:53.436208 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:07:53.436190 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b72d5f2c13d16e34d739e63ef19c76fd24535f684b3b70542147cacc7db6663f\": container with ID starting with b72d5f2c13d16e34d739e63ef19c76fd24535f684b3b70542147cacc7db6663f not found: ID does not exist" containerID="b72d5f2c13d16e34d739e63ef19c76fd24535f684b3b70542147cacc7db6663f"
Apr 25 00:07:53.436251 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.436215 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b72d5f2c13d16e34d739e63ef19c76fd24535f684b3b70542147cacc7db6663f"} err="failed to get container status \"b72d5f2c13d16e34d739e63ef19c76fd24535f684b3b70542147cacc7db6663f\": rpc error: code = NotFound desc = could not find container \"b72d5f2c13d16e34d739e63ef19c76fd24535f684b3b70542147cacc7db6663f\": container with ID starting with b72d5f2c13d16e34d739e63ef19c76fd24535f684b3b70542147cacc7db6663f not found: ID does not exist"
Apr 25 00:07:53.436251 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.436231 2576 scope.go:117] "RemoveContainer" containerID="27f27361149968ea2104d0264894def205267aad017b21922e32221d34081f27"
Apr 25 00:07:53.436451 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:07:53.436434 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f27361149968ea2104d0264894def205267aad017b21922e32221d34081f27\": container with ID starting with 27f27361149968ea2104d0264894def205267aad017b21922e32221d34081f27 not found: ID does not exist" containerID="27f27361149968ea2104d0264894def205267aad017b21922e32221d34081f27"
Apr 25 00:07:53.436490 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:53.436454 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f27361149968ea2104d0264894def205267aad017b21922e32221d34081f27"} err="failed to get container status \"27f27361149968ea2104d0264894def205267aad017b21922e32221d34081f27\": rpc error: code = NotFound desc = could not find container \"27f27361149968ea2104d0264894def205267aad017b21922e32221d34081f27\": container with ID starting with 27f27361149968ea2104d0264894def205267aad017b21922e32221d34081f27 not found: ID does not exist"
Apr 25 00:07:54.417250 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:54.417213 2576 generic.go:358] "Generic (PLEG): container finished" podID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerID="64288967fef535b8430738760ae63045f1b0af1b31c0d4d17eeb55c44178ab5f" exitCode=0
Apr 25 00:07:54.417738 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:54.417293 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" event={"ID":"94574d49-dd0d-4611-b527-a47d1fe2edca","Type":"ContainerDied","Data":"64288967fef535b8430738760ae63045f1b0af1b31c0d4d17eeb55c44178ab5f"}
Apr 25 00:07:55.406755 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:55.406686 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" path="/var/lib/kubelet/pods/051a1f66-e6a7-47a9-8eab-ec1836b062fa/volumes"
Apr 25 00:07:55.422162 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:55.422131 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" event={"ID":"94574d49-dd0d-4611-b527-a47d1fe2edca","Type":"ContainerStarted","Data":"e060ff2a4b3c20954909c76c466bf20cb229559821422578ba72335dc17a0792"}
Apr 25 00:07:55.422528 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:55.422174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" event={"ID":"94574d49-dd0d-4611-b527-a47d1fe2edca","Type":"ContainerStarted","Data":"c9dea6134a698aef721318265acab816fbe5961e34fe751241acb2c52975f567"}
Apr 25 00:07:55.422528 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:55.422393 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:55.440245 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:55.440196 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" podStartSLOduration=7.440185824 podStartE2EDuration="7.440185824s" podCreationTimestamp="2026-04-25 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:07:55.438982704 +0000 UTC m=+836.599932357" watchObservedRunningTime="2026-04-25 00:07:55.440185824 +0000 UTC m=+836.601135464"
Apr 25 00:07:56.426312 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:56.426282 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:07:56.427535 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:56.427509 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 25 00:07:57.429950 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:57.429913 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 25 00:07:58.373392 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:07:58.373347 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk" podUID="98cd8436-3894-46a1-8eff-e943920e9581" containerName="model-chainer-raw-3b499" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:08:02.436100 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:02.436071 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"
Apr 25 00:08:02.436664 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:02.436634 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 25 00:08:03.373550 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:03.373509 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk" podUID="98cd8436-3894-46a1-8eff-e943920e9581" containerName="model-chainer-raw-3b499" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:08:03.373756 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:03.373612 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"
Apr 25 00:08:08.373444 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:08.373405 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk" podUID="98cd8436-3894-46a1-8eff-e943920e9581" containerName="model-chainer-raw-3b499" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:08:12.437357 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:12.437312 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 25 00:08:13.373655 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:13.373619 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk" podUID="98cd8436-3894-46a1-8eff-e943920e9581" containerName="model-chainer-raw-3b499" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:08:18.373786 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:18.373742 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"
podUID="98cd8436-3894-46a1-8eff-e943920e9581" containerName="model-chainer-raw-3b499" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:08:18.828328 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:08:18.828290 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98cd8436_3894_46a1_8eff_e943920e9581.slice/crio-8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98cd8436_3894_46a1_8eff_e943920e9581.slice/crio-conmon-8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857.scope\": RecentStats: unable to find data in memory cache]" Apr 25 00:08:18.828438 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:08:18.828326 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98cd8436_3894_46a1_8eff_e943920e9581.slice/crio-conmon-8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857.scope\": RecentStats: unable to find data in memory cache]" Apr 25 00:08:18.828483 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:08:18.828435 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98cd8436_3894_46a1_8eff_e943920e9581.slice/crio-conmon-8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857.scope\": RecentStats: unable to find data in memory cache]" Apr 25 00:08:18.828610 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:08:18.828580 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98cd8436_3894_46a1_8eff_e943920e9581.slice/crio-0ebc009cd87ae33df1200972e0a3061ce7c71d692beb59fffd145b4a811acbfb\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98cd8436_3894_46a1_8eff_e943920e9581.slice/crio-8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98cd8436_3894_46a1_8eff_e943920e9581.slice/crio-conmon-8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857.scope\": RecentStats: unable to find data in memory cache]" Apr 25 00:08:18.957040 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:18.957014 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk" Apr 25 00:08:19.084853 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:19.084768 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98cd8436-3894-46a1-8eff-e943920e9581-proxy-tls\") pod \"98cd8436-3894-46a1-8eff-e943920e9581\" (UID: \"98cd8436-3894-46a1-8eff-e943920e9581\") " Apr 25 00:08:19.085007 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:19.084861 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98cd8436-3894-46a1-8eff-e943920e9581-openshift-service-ca-bundle\") pod \"98cd8436-3894-46a1-8eff-e943920e9581\" (UID: \"98cd8436-3894-46a1-8eff-e943920e9581\") " Apr 25 00:08:19.085210 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:19.085181 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98cd8436-3894-46a1-8eff-e943920e9581-openshift-service-ca-bundle" (OuterVolumeSpecName: 
"openshift-service-ca-bundle") pod "98cd8436-3894-46a1-8eff-e943920e9581" (UID: "98cd8436-3894-46a1-8eff-e943920e9581"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:08:19.087063 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:19.087034 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98cd8436-3894-46a1-8eff-e943920e9581-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "98cd8436-3894-46a1-8eff-e943920e9581" (UID: "98cd8436-3894-46a1-8eff-e943920e9581"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:08:19.185403 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:19.185354 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98cd8436-3894-46a1-8eff-e943920e9581-proxy-tls\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:08:19.185403 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:19.185399 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98cd8436-3894-46a1-8eff-e943920e9581-openshift-service-ca-bundle\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:08:19.510779 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:19.510743 2576 generic.go:358] "Generic (PLEG): container finished" podID="98cd8436-3894-46a1-8eff-e943920e9581" containerID="8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857" exitCode=0 Apr 25 00:08:19.511230 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:19.510802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk" event={"ID":"98cd8436-3894-46a1-8eff-e943920e9581","Type":"ContainerDied","Data":"8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857"} Apr 25 00:08:19.511230 ip-10-0-132-64 kubenswrapper[2576]: I0425 
00:08:19.510807 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk" Apr 25 00:08:19.511230 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:19.510831 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk" event={"ID":"98cd8436-3894-46a1-8eff-e943920e9581","Type":"ContainerDied","Data":"0ebc009cd87ae33df1200972e0a3061ce7c71d692beb59fffd145b4a811acbfb"} Apr 25 00:08:19.511230 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:19.510852 2576 scope.go:117] "RemoveContainer" containerID="8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857" Apr 25 00:08:19.519527 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:19.519510 2576 scope.go:117] "RemoveContainer" containerID="8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857" Apr 25 00:08:19.519788 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:08:19.519766 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857\": container with ID starting with 8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857 not found: ID does not exist" containerID="8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857" Apr 25 00:08:19.519869 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:19.519794 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857"} err="failed to get container status \"8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857\": rpc error: code = NotFound desc = could not find container \"8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857\": container with ID starting with 8b6b81de8b88795a1d8b081de5ad42cb1c590349f6b89095aafa9e8a3afcf857 not found: ID does not 
exist" Apr 25 00:08:19.527070 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:19.527047 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"] Apr 25 00:08:19.530732 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:19.530710 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3b499-57f97cb5b9-zc5fk"] Apr 25 00:08:21.399999 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:21.399961 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98cd8436-3894-46a1-8eff-e943920e9581" path="/var/lib/kubelet/pods/98cd8436-3894-46a1-8eff-e943920e9581/volumes" Apr 25 00:08:22.437558 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:22.437520 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 25 00:08:32.437189 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:32.437146 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 25 00:08:42.436893 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:42.436855 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 25 00:08:52.436733 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:52.436675 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 25 00:08:59.344827 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:59.344795 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/1.log" Apr 25 00:08:59.347058 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:59.347030 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/1.log" Apr 25 00:08:59.349255 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:59.349225 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log" Apr 25 00:08:59.351356 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:08:59.351335 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log" Apr 25 00:09:02.437849 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:02.437822 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" Apr 25 00:09:19.049623 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.049580 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8"] Apr 25 00:09:19.050049 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.050033 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98cd8436-3894-46a1-8eff-e943920e9581" containerName="model-chainer-raw-3b499" Apr 25 00:09:19.050093 
ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.050050 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="98cd8436-3894-46a1-8eff-e943920e9581" containerName="model-chainer-raw-3b499" Apr 25 00:09:19.050093 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.050067 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="storage-initializer" Apr 25 00:09:19.050093 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.050074 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="storage-initializer" Apr 25 00:09:19.050093 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.050080 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kserve-container" Apr 25 00:09:19.050093 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.050086 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kserve-container" Apr 25 00:09:19.050093 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.050093 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kube-rbac-proxy" Apr 25 00:09:19.050277 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.050098 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kube-rbac-proxy" Apr 25 00:09:19.050277 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.050163 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kserve-container" Apr 25 00:09:19.050277 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.050172 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="051a1f66-e6a7-47a9-8eab-ec1836b062fa" containerName="kube-rbac-proxy" Apr 25 
00:09:19.050277 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.050183 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="98cd8436-3894-46a1-8eff-e943920e9581" containerName="model-chainer-raw-3b499" Apr 25 00:09:19.053165 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.053149 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" Apr 25 00:09:19.055466 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.055445 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\"" Apr 25 00:09:19.055466 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.055454 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-3f0c4-serving-cert\"" Apr 25 00:09:19.061976 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.061953 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8"] Apr 25 00:09:19.167432 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.167406 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a73f646-e3e6-4acc-a61e-c5317041365a-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8\" (UID: \"3a73f646-e3e6-4acc-a61e-c5317041365a\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" Apr 25 00:09:19.167576 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.167461 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a73f646-e3e6-4acc-a61e-c5317041365a-proxy-tls\") pod \"model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8\" (UID: 
\"3a73f646-e3e6-4acc-a61e-c5317041365a\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" Apr 25 00:09:19.268777 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.268741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a73f646-e3e6-4acc-a61e-c5317041365a-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8\" (UID: \"3a73f646-e3e6-4acc-a61e-c5317041365a\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" Apr 25 00:09:19.268936 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.268867 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a73f646-e3e6-4acc-a61e-c5317041365a-proxy-tls\") pod \"model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8\" (UID: \"3a73f646-e3e6-4acc-a61e-c5317041365a\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" Apr 25 00:09:19.269442 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.269418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a73f646-e3e6-4acc-a61e-c5317041365a-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8\" (UID: \"3a73f646-e3e6-4acc-a61e-c5317041365a\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" Apr 25 00:09:19.271301 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.271281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a73f646-e3e6-4acc-a61e-c5317041365a-proxy-tls\") pod \"model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8\" (UID: \"3a73f646-e3e6-4acc-a61e-c5317041365a\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" Apr 25 00:09:19.364884 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.364797 
2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" Apr 25 00:09:19.489864 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.489836 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8"] Apr 25 00:09:19.492035 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:09:19.492006 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a73f646_e3e6_4acc_a61e_c5317041365a.slice/crio-d9ab0eba1270ff5dd1cb1efe88a3ff34abb07dbd6227a37a9e74d97b45251a15 WatchSource:0}: Error finding container d9ab0eba1270ff5dd1cb1efe88a3ff34abb07dbd6227a37a9e74d97b45251a15: Status 404 returned error can't find the container with id d9ab0eba1270ff5dd1cb1efe88a3ff34abb07dbd6227a37a9e74d97b45251a15 Apr 25 00:09:19.726394 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.726299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" event={"ID":"3a73f646-e3e6-4acc-a61e-c5317041365a","Type":"ContainerStarted","Data":"1ad8c80417dcaf0a149b8e4f6545927d7781913c0a460e3094d6172c6a52fec4"} Apr 25 00:09:19.726394 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.726341 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" event={"ID":"3a73f646-e3e6-4acc-a61e-c5317041365a","Type":"ContainerStarted","Data":"d9ab0eba1270ff5dd1cb1efe88a3ff34abb07dbd6227a37a9e74d97b45251a15"} Apr 25 00:09:19.726624 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.726442 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" Apr 25 00:09:19.742682 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:19.742637 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" podStartSLOduration=0.74262451 podStartE2EDuration="742.62451ms" podCreationTimestamp="2026-04-25 00:09:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:09:19.740955998 +0000 UTC m=+920.901905651" watchObservedRunningTime="2026-04-25 00:09:19.74262451 +0000 UTC m=+920.903574175" Apr 25 00:09:25.736196 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:25.736168 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" Apr 25 00:09:29.096433 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:29.096398 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8"] Apr 25 00:09:29.096916 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:29.096609 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" podUID="3a73f646-e3e6-4acc-a61e-c5317041365a" containerName="model-chainer-raw-hpa-3f0c4" containerID="cri-o://1ad8c80417dcaf0a149b8e4f6545927d7781913c0a460e3094d6172c6a52fec4" gracePeriod=30 Apr 25 00:09:29.248517 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:29.248474 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"] Apr 25 00:09:29.248874 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:29.248850 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kserve-container" containerID="cri-o://c9dea6134a698aef721318265acab816fbe5961e34fe751241acb2c52975f567" gracePeriod=30 Apr 25 00:09:29.248970 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:09:29.248881 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kube-rbac-proxy" containerID="cri-o://e060ff2a4b3c20954909c76c466bf20cb229559821422578ba72335dc17a0792" gracePeriod=30 Apr 25 00:09:29.763975 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:29.763947 2576 generic.go:358] "Generic (PLEG): container finished" podID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerID="e060ff2a4b3c20954909c76c466bf20cb229559821422578ba72335dc17a0792" exitCode=2 Apr 25 00:09:29.764126 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:29.764025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" event={"ID":"94574d49-dd0d-4611-b527-a47d1fe2edca","Type":"ContainerDied","Data":"e060ff2a4b3c20954909c76c466bf20cb229559821422578ba72335dc17a0792"} Apr 25 00:09:30.734451 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:30.734413 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" podUID="3a73f646-e3e6-4acc-a61e-c5317041365a" containerName="model-chainer-raw-hpa-3f0c4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:09:32.430606 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:32.430565 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": dial tcp 10.134.0.40:8643: connect: connection refused" Apr 25 00:09:32.436657 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:32.436631 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 25 00:09:33.295657 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.295636 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" Apr 25 00:09:33.394465 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.394376 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94574d49-dd0d-4611-b527-a47d1fe2edca-kserve-provision-location\") pod \"94574d49-dd0d-4611-b527-a47d1fe2edca\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " Apr 25 00:09:33.394465 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.394427 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94574d49-dd0d-4611-b527-a47d1fe2edca-proxy-tls\") pod \"94574d49-dd0d-4611-b527-a47d1fe2edca\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " Apr 25 00:09:33.394735 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.394498 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94574d49-dd0d-4611-b527-a47d1fe2edca-isvc-sklearn-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\") pod \"94574d49-dd0d-4611-b527-a47d1fe2edca\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " Apr 25 00:09:33.394735 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.394534 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brf2w\" (UniqueName: 
\"kubernetes.io/projected/94574d49-dd0d-4611-b527-a47d1fe2edca-kube-api-access-brf2w\") pod \"94574d49-dd0d-4611-b527-a47d1fe2edca\" (UID: \"94574d49-dd0d-4611-b527-a47d1fe2edca\") " Apr 25 00:09:33.394859 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.394767 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94574d49-dd0d-4611-b527-a47d1fe2edca-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "94574d49-dd0d-4611-b527-a47d1fe2edca" (UID: "94574d49-dd0d-4611-b527-a47d1fe2edca"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:09:33.394934 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.394911 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94574d49-dd0d-4611-b527-a47d1fe2edca-isvc-sklearn-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config") pod "94574d49-dd0d-4611-b527-a47d1fe2edca" (UID: "94574d49-dd0d-4611-b527-a47d1fe2edca"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:09:33.396730 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.396662 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94574d49-dd0d-4611-b527-a47d1fe2edca-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "94574d49-dd0d-4611-b527-a47d1fe2edca" (UID: "94574d49-dd0d-4611-b527-a47d1fe2edca"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:09:33.396846 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.396779 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94574d49-dd0d-4611-b527-a47d1fe2edca-kube-api-access-brf2w" (OuterVolumeSpecName: "kube-api-access-brf2w") pod "94574d49-dd0d-4611-b527-a47d1fe2edca" (UID: "94574d49-dd0d-4611-b527-a47d1fe2edca"). InnerVolumeSpecName "kube-api-access-brf2w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:09:33.495647 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.495614 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94574d49-dd0d-4611-b527-a47d1fe2edca-isvc-sklearn-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:09:33.495647 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.495643 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-brf2w\" (UniqueName: \"kubernetes.io/projected/94574d49-dd0d-4611-b527-a47d1fe2edca-kube-api-access-brf2w\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:09:33.495647 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.495654 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94574d49-dd0d-4611-b527-a47d1fe2edca-kserve-provision-location\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:09:33.496087 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.495663 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94574d49-dd0d-4611-b527-a47d1fe2edca-proxy-tls\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:09:33.782861 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.782832 2576 generic.go:358] "Generic 
(PLEG): container finished" podID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerID="c9dea6134a698aef721318265acab816fbe5961e34fe751241acb2c52975f567" exitCode=0 Apr 25 00:09:33.783017 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.782888 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" event={"ID":"94574d49-dd0d-4611-b527-a47d1fe2edca","Type":"ContainerDied","Data":"c9dea6134a698aef721318265acab816fbe5961e34fe751241acb2c52975f567"} Apr 25 00:09:33.783017 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.782909 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" Apr 25 00:09:33.783017 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.782927 2576 scope.go:117] "RemoveContainer" containerID="e060ff2a4b3c20954909c76c466bf20cb229559821422578ba72335dc17a0792" Apr 25 00:09:33.783145 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.782916 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt" event={"ID":"94574d49-dd0d-4611-b527-a47d1fe2edca","Type":"ContainerDied","Data":"792b5fd8cc264e41489dc60e231f62d2f85f20d6ba49748c6f4f1b42cd046569"} Apr 25 00:09:33.791344 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.791325 2576 scope.go:117] "RemoveContainer" containerID="c9dea6134a698aef721318265acab816fbe5961e34fe751241acb2c52975f567" Apr 25 00:09:33.799644 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.799626 2576 scope.go:117] "RemoveContainer" containerID="64288967fef535b8430738760ae63045f1b0af1b31c0d4d17eeb55c44178ab5f" Apr 25 00:09:33.800071 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.800053 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"] Apr 25 00:09:33.802849 
ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.802829 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3f0c4-predictor-759c5664f9-trsgt"] Apr 25 00:09:33.807189 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.807173 2576 scope.go:117] "RemoveContainer" containerID="e060ff2a4b3c20954909c76c466bf20cb229559821422578ba72335dc17a0792" Apr 25 00:09:33.807466 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:09:33.807447 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e060ff2a4b3c20954909c76c466bf20cb229559821422578ba72335dc17a0792\": container with ID starting with e060ff2a4b3c20954909c76c466bf20cb229559821422578ba72335dc17a0792 not found: ID does not exist" containerID="e060ff2a4b3c20954909c76c466bf20cb229559821422578ba72335dc17a0792" Apr 25 00:09:33.807536 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.807480 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e060ff2a4b3c20954909c76c466bf20cb229559821422578ba72335dc17a0792"} err="failed to get container status \"e060ff2a4b3c20954909c76c466bf20cb229559821422578ba72335dc17a0792\": rpc error: code = NotFound desc = could not find container \"e060ff2a4b3c20954909c76c466bf20cb229559821422578ba72335dc17a0792\": container with ID starting with e060ff2a4b3c20954909c76c466bf20cb229559821422578ba72335dc17a0792 not found: ID does not exist" Apr 25 00:09:33.807536 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.807508 2576 scope.go:117] "RemoveContainer" containerID="c9dea6134a698aef721318265acab816fbe5961e34fe751241acb2c52975f567" Apr 25 00:09:33.807776 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:09:33.807758 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9dea6134a698aef721318265acab816fbe5961e34fe751241acb2c52975f567\": container with ID starting with 
c9dea6134a698aef721318265acab816fbe5961e34fe751241acb2c52975f567 not found: ID does not exist" containerID="c9dea6134a698aef721318265acab816fbe5961e34fe751241acb2c52975f567" Apr 25 00:09:33.807822 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.807782 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9dea6134a698aef721318265acab816fbe5961e34fe751241acb2c52975f567"} err="failed to get container status \"c9dea6134a698aef721318265acab816fbe5961e34fe751241acb2c52975f567\": rpc error: code = NotFound desc = could not find container \"c9dea6134a698aef721318265acab816fbe5961e34fe751241acb2c52975f567\": container with ID starting with c9dea6134a698aef721318265acab816fbe5961e34fe751241acb2c52975f567 not found: ID does not exist" Apr 25 00:09:33.807822 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.807799 2576 scope.go:117] "RemoveContainer" containerID="64288967fef535b8430738760ae63045f1b0af1b31c0d4d17eeb55c44178ab5f" Apr 25 00:09:33.808002 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:09:33.807988 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64288967fef535b8430738760ae63045f1b0af1b31c0d4d17eeb55c44178ab5f\": container with ID starting with 64288967fef535b8430738760ae63045f1b0af1b31c0d4d17eeb55c44178ab5f not found: ID does not exist" containerID="64288967fef535b8430738760ae63045f1b0af1b31c0d4d17eeb55c44178ab5f" Apr 25 00:09:33.808045 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:33.808005 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64288967fef535b8430738760ae63045f1b0af1b31c0d4d17eeb55c44178ab5f"} err="failed to get container status \"64288967fef535b8430738760ae63045f1b0af1b31c0d4d17eeb55c44178ab5f\": rpc error: code = NotFound desc = could not find container \"64288967fef535b8430738760ae63045f1b0af1b31c0d4d17eeb55c44178ab5f\": container with ID starting with 
64288967fef535b8430738760ae63045f1b0af1b31c0d4d17eeb55c44178ab5f not found: ID does not exist" Apr 25 00:09:35.399564 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:35.399532 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" path="/var/lib/kubelet/pods/94574d49-dd0d-4611-b527-a47d1fe2edca/volumes" Apr 25 00:09:35.734244 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:35.734146 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" podUID="3a73f646-e3e6-4acc-a61e-c5317041365a" containerName="model-chainer-raw-hpa-3f0c4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:09:39.335880 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.335838 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k"] Apr 25 00:09:39.336324 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.336308 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kserve-container" Apr 25 00:09:39.336372 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.336327 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kserve-container" Apr 25 00:09:39.336372 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.336343 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="storage-initializer" Apr 25 00:09:39.336372 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.336349 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="storage-initializer" Apr 25 00:09:39.336372 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.336358 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kube-rbac-proxy" Apr 25 00:09:39.336372 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.336364 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kube-rbac-proxy" Apr 25 00:09:39.336519 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.336426 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kube-rbac-proxy" Apr 25 00:09:39.336519 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.336435 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="94574d49-dd0d-4611-b527-a47d1fe2edca" containerName="kserve-container" Apr 25 00:09:39.341324 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.341305 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:39.343621 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.343597 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 25 00:09:39.343894 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.343878 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-b8788-predictor-serving-cert\"" Apr 25 00:09:39.343981 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.343900 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-b8788-kube-rbac-proxy-sar-config\"" Apr 25 00:09:39.349466 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.349446 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k"] Apr 25 00:09:39.447942 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.447913 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6306d78c-102c-40cb-8924-588efa962ea2-proxy-tls\") pod \"isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:39.447942 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.447943 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6306d78c-102c-40cb-8924-588efa962ea2-kserve-provision-location\") pod \"isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:39.448120 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.447981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-b8788-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6306d78c-102c-40cb-8924-588efa962ea2-isvc-logger-raw-b8788-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:39.448120 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.448029 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z98l\" (UniqueName: \"kubernetes.io/projected/6306d78c-102c-40cb-8924-588efa962ea2-kube-api-access-2z98l\") pod \"isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:39.548655 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.548629 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6306d78c-102c-40cb-8924-588efa962ea2-proxy-tls\") pod \"isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:39.548799 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.548658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6306d78c-102c-40cb-8924-588efa962ea2-kserve-provision-location\") pod \"isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:39.548799 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.548715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-raw-b8788-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6306d78c-102c-40cb-8924-588efa962ea2-isvc-logger-raw-b8788-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:39.548799 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.548743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z98l\" (UniqueName: \"kubernetes.io/projected/6306d78c-102c-40cb-8924-588efa962ea2-kube-api-access-2z98l\") pod \"isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:39.548978 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:09:39.548805 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-serving-cert: secret 
"isvc-logger-raw-b8788-predictor-serving-cert" not found Apr 25 00:09:39.548978 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:09:39.548873 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6306d78c-102c-40cb-8924-588efa962ea2-proxy-tls podName:6306d78c-102c-40cb-8924-588efa962ea2 nodeName:}" failed. No retries permitted until 2026-04-25 00:09:40.048851473 +0000 UTC m=+941.209801101 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6306d78c-102c-40cb-8924-588efa962ea2-proxy-tls") pod "isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" (UID: "6306d78c-102c-40cb-8924-588efa962ea2") : secret "isvc-logger-raw-b8788-predictor-serving-cert" not found Apr 25 00:09:39.549093 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.549075 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6306d78c-102c-40cb-8924-588efa962ea2-kserve-provision-location\") pod \"isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:39.549339 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.549320 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-b8788-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6306d78c-102c-40cb-8924-588efa962ea2-isvc-logger-raw-b8788-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:39.559640 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:39.559619 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z98l\" (UniqueName: 
\"kubernetes.io/projected/6306d78c-102c-40cb-8924-588efa962ea2-kube-api-access-2z98l\") pod \"isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:40.053835 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:40.053791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6306d78c-102c-40cb-8924-588efa962ea2-proxy-tls\") pod \"isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:40.056418 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:40.056399 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6306d78c-102c-40cb-8924-588efa962ea2-proxy-tls\") pod \"isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:40.253383 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:40.253338 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:40.386465 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:40.386438 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k"] Apr 25 00:09:40.388552 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:09:40.388528 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6306d78c_102c_40cb_8924_588efa962ea2.slice/crio-2cb2cebbf3612bbcf4eea706a4197a727e2447530fef0a626b9f2a19669923d1 WatchSource:0}: Error finding container 2cb2cebbf3612bbcf4eea706a4197a727e2447530fef0a626b9f2a19669923d1: Status 404 returned error can't find the container with id 2cb2cebbf3612bbcf4eea706a4197a727e2447530fef0a626b9f2a19669923d1 Apr 25 00:09:40.735456 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:40.735364 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" podUID="3a73f646-e3e6-4acc-a61e-c5317041365a" containerName="model-chainer-raw-hpa-3f0c4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:09:40.735597 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:40.735465 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" Apr 25 00:09:40.809619 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:40.809584 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" event={"ID":"6306d78c-102c-40cb-8924-588efa962ea2","Type":"ContainerStarted","Data":"f06c1a2144cda137009cd97f4e33ef65eb7143223a976771c76157e16c5a0681"} Apr 25 00:09:40.809619 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:40.809618 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" event={"ID":"6306d78c-102c-40cb-8924-588efa962ea2","Type":"ContainerStarted","Data":"2cb2cebbf3612bbcf4eea706a4197a727e2447530fef0a626b9f2a19669923d1"} Apr 25 00:09:44.827511 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:44.827476 2576 generic.go:358] "Generic (PLEG): container finished" podID="6306d78c-102c-40cb-8924-588efa962ea2" containerID="f06c1a2144cda137009cd97f4e33ef65eb7143223a976771c76157e16c5a0681" exitCode=0 Apr 25 00:09:44.827921 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:44.827554 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" event={"ID":"6306d78c-102c-40cb-8924-588efa962ea2","Type":"ContainerDied","Data":"f06c1a2144cda137009cd97f4e33ef65eb7143223a976771c76157e16c5a0681"} Apr 25 00:09:45.734806 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:45.734773 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" podUID="3a73f646-e3e6-4acc-a61e-c5317041365a" containerName="model-chainer-raw-hpa-3f0c4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:09:45.833035 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:45.833005 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" event={"ID":"6306d78c-102c-40cb-8924-588efa962ea2","Type":"ContainerStarted","Data":"fbb50259e7d5775933429c2677b0cba44bd4210c54b911387fa38fbc9c866ed4"} Apr 25 00:09:45.833430 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:45.833045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" event={"ID":"6306d78c-102c-40cb-8924-588efa962ea2","Type":"ContainerStarted","Data":"e67d9b637d1c3901ba086e8266f79c880dcf29e2c074f1f9b4f866fd91b03dac"} Apr 25 00:09:45.833430 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:09:45.833056 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" event={"ID":"6306d78c-102c-40cb-8924-588efa962ea2","Type":"ContainerStarted","Data":"dbaab9529b63e85cc2ede40e6368c3232fd7d21ee440412ad6a4a9ba29b8792b"} Apr 25 00:09:45.833430 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:45.833377 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:45.833564 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:45.833518 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:45.834572 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:45.834544 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:09:45.851460 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:45.851416 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podStartSLOduration=6.851404385 podStartE2EDuration="6.851404385s" podCreationTimestamp="2026-04-25 00:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:09:45.851091918 +0000 UTC m=+947.012041561" watchObservedRunningTime="2026-04-25 00:09:45.851404385 +0000 UTC m=+947.012354025" Apr 25 00:09:46.836431 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:46.836384 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" 
podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:09:46.836431 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:46.836418 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:46.837312 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:46.837284 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:09:47.840486 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:47.840442 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:09:47.840986 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:47.840961 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:09:50.734969 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:50.734917 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" podUID="3a73f646-e3e6-4acc-a61e-c5317041365a" containerName="model-chainer-raw-hpa-3f0c4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:09:52.845788 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:52.845756 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:09:52.846438 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:52.846399 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:09:52.846599 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:52.846564 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:09:55.734514 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:55.734476 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" podUID="3a73f646-e3e6-4acc-a61e-c5317041365a" containerName="model-chainer-raw-hpa-3f0c4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:09:59.275265 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.275234 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" Apr 25 00:09:59.323341 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.323309 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a73f646-e3e6-4acc-a61e-c5317041365a-proxy-tls\") pod \"3a73f646-e3e6-4acc-a61e-c5317041365a\" (UID: \"3a73f646-e3e6-4acc-a61e-c5317041365a\") " Apr 25 00:09:59.323462 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.323358 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a73f646-e3e6-4acc-a61e-c5317041365a-openshift-service-ca-bundle\") pod \"3a73f646-e3e6-4acc-a61e-c5317041365a\" (UID: \"3a73f646-e3e6-4acc-a61e-c5317041365a\") " Apr 25 00:09:59.323743 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.323711 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a73f646-e3e6-4acc-a61e-c5317041365a-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "3a73f646-e3e6-4acc-a61e-c5317041365a" (UID: "3a73f646-e3e6-4acc-a61e-c5317041365a"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:09:59.325387 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.325352 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a73f646-e3e6-4acc-a61e-c5317041365a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3a73f646-e3e6-4acc-a61e-c5317041365a" (UID: "3a73f646-e3e6-4acc-a61e-c5317041365a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:09:59.424181 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.424153 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a73f646-e3e6-4acc-a61e-c5317041365a-proxy-tls\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:09:59.424181 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.424176 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a73f646-e3e6-4acc-a61e-c5317041365a-openshift-service-ca-bundle\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:09:59.885110 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.885073 2576 generic.go:358] "Generic (PLEG): container finished" podID="3a73f646-e3e6-4acc-a61e-c5317041365a" containerID="1ad8c80417dcaf0a149b8e4f6545927d7781913c0a460e3094d6172c6a52fec4" exitCode=137 Apr 25 00:09:59.885315 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.885135 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" Apr 25 00:09:59.885315 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.885157 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" event={"ID":"3a73f646-e3e6-4acc-a61e-c5317041365a","Type":"ContainerDied","Data":"1ad8c80417dcaf0a149b8e4f6545927d7781913c0a460e3094d6172c6a52fec4"} Apr 25 00:09:59.885315 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.885202 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8" event={"ID":"3a73f646-e3e6-4acc-a61e-c5317041365a","Type":"ContainerDied","Data":"d9ab0eba1270ff5dd1cb1efe88a3ff34abb07dbd6227a37a9e74d97b45251a15"} Apr 25 00:09:59.885315 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.885222 2576 scope.go:117] "RemoveContainer" containerID="1ad8c80417dcaf0a149b8e4f6545927d7781913c0a460e3094d6172c6a52fec4" Apr 25 00:09:59.893674 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.893658 2576 scope.go:117] "RemoveContainer" containerID="1ad8c80417dcaf0a149b8e4f6545927d7781913c0a460e3094d6172c6a52fec4" Apr 25 00:09:59.893924 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:09:59.893903 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad8c80417dcaf0a149b8e4f6545927d7781913c0a460e3094d6172c6a52fec4\": container with ID starting with 1ad8c80417dcaf0a149b8e4f6545927d7781913c0a460e3094d6172c6a52fec4 not found: ID does not exist" containerID="1ad8c80417dcaf0a149b8e4f6545927d7781913c0a460e3094d6172c6a52fec4" Apr 25 00:09:59.894007 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.893934 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad8c80417dcaf0a149b8e4f6545927d7781913c0a460e3094d6172c6a52fec4"} err="failed to get container status 
\"1ad8c80417dcaf0a149b8e4f6545927d7781913c0a460e3094d6172c6a52fec4\": rpc error: code = NotFound desc = could not find container \"1ad8c80417dcaf0a149b8e4f6545927d7781913c0a460e3094d6172c6a52fec4\": container with ID starting with 1ad8c80417dcaf0a149b8e4f6545927d7781913c0a460e3094d6172c6a52fec4 not found: ID does not exist" Apr 25 00:09:59.899743 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.899718 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8"] Apr 25 00:09:59.902951 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:09:59.902930 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3f0c4-868c7dd7d5-g9vg8"] Apr 25 00:10:01.405716 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:10:01.405658 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a73f646-e3e6-4acc-a61e-c5317041365a" path="/var/lib/kubelet/pods/3a73f646-e3e6-4acc-a61e-c5317041365a/volumes" Apr 25 00:10:02.846735 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:10:02.846662 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:10:02.847163 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:10:02.847129 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:10:12.846972 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:10:12.846866 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:10:12.847433 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:10:12.847406 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:10:22.846302 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:10:22.846258 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:10:22.846748 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:10:22.846727 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:10:32.846587 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:10:32.846539 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:10:32.847140 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:10:32.847006 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:10:42.846424 ip-10-0-132-64 kubenswrapper[2576]: I0425 
00:10:42.846378 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:10:42.846884 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:10:42.846802 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:10:52.846865 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:10:52.846833 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:10:52.847279 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:10:52.847074 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:11:04.560777 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.560737 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k"] Apr 25 00:11:04.561356 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.561268 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kserve-container" containerID="cri-o://dbaab9529b63e85cc2ede40e6368c3232fd7d21ee440412ad6a4a9ba29b8792b" gracePeriod=30 Apr 25 00:11:04.561356 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.561297 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" 
podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="agent" containerID="cri-o://fbb50259e7d5775933429c2677b0cba44bd4210c54b911387fa38fbc9c866ed4" gracePeriod=30 Apr 25 00:11:04.561492 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.561327 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kube-rbac-proxy" containerID="cri-o://e67d9b637d1c3901ba086e8266f79c880dcf29e2c074f1f9b4f866fd91b03dac" gracePeriod=30 Apr 25 00:11:04.603986 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.603952 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d"] Apr 25 00:11:04.604444 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.604425 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a73f646-e3e6-4acc-a61e-c5317041365a" containerName="model-chainer-raw-hpa-3f0c4" Apr 25 00:11:04.604532 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.604446 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a73f646-e3e6-4acc-a61e-c5317041365a" containerName="model-chainer-raw-hpa-3f0c4" Apr 25 00:11:04.604593 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.604533 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a73f646-e3e6-4acc-a61e-c5317041365a" containerName="model-chainer-raw-hpa-3f0c4" Apr 25 00:11:04.609096 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.609076 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:04.611397 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.611378 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-e81d6-predictor-serving-cert\"" Apr 25 00:11:04.611496 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.611383 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-e81d6-kube-rbac-proxy-sar-config\"" Apr 25 00:11:04.616829 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.616806 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d"] Apr 25 00:11:04.685745 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.685719 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p248\" (UniqueName: \"kubernetes.io/projected/2701b787-c103-4814-829e-91bf7d8efe16-kube-api-access-5p248\") pod \"isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:04.685876 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.685776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2701b787-c103-4814-829e-91bf7d8efe16-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:04.685876 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.685804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-sklearn-scale-raw-e81d6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2701b787-c103-4814-829e-91bf7d8efe16-isvc-sklearn-scale-raw-e81d6-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:04.685876 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.685827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2701b787-c103-4814-829e-91bf7d8efe16-proxy-tls\") pod \"isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:04.786708 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.786666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p248\" (UniqueName: \"kubernetes.io/projected/2701b787-c103-4814-829e-91bf7d8efe16-kube-api-access-5p248\") pod \"isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:04.786889 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.786748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2701b787-c103-4814-829e-91bf7d8efe16-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:04.786889 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.786771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"isvc-sklearn-scale-raw-e81d6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2701b787-c103-4814-829e-91bf7d8efe16-isvc-sklearn-scale-raw-e81d6-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:04.786889 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.786796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2701b787-c103-4814-829e-91bf7d8efe16-proxy-tls\") pod \"isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:04.787061 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:11:04.786918 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-serving-cert: secret "isvc-sklearn-scale-raw-e81d6-predictor-serving-cert" not found Apr 25 00:11:04.787061 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:11:04.786975 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2701b787-c103-4814-829e-91bf7d8efe16-proxy-tls podName:2701b787-c103-4814-829e-91bf7d8efe16 nodeName:}" failed. No retries permitted until 2026-04-25 00:11:05.286956033 +0000 UTC m=+1026.447905656 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2701b787-c103-4814-829e-91bf7d8efe16-proxy-tls") pod "isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" (UID: "2701b787-c103-4814-829e-91bf7d8efe16") : secret "isvc-sklearn-scale-raw-e81d6-predictor-serving-cert" not found Apr 25 00:11:04.787184 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.787131 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2701b787-c103-4814-829e-91bf7d8efe16-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:04.787476 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.787452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-e81d6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2701b787-c103-4814-829e-91bf7d8efe16-isvc-sklearn-scale-raw-e81d6-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:04.795857 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:04.795828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p248\" (UniqueName: \"kubernetes.io/projected/2701b787-c103-4814-829e-91bf7d8efe16-kube-api-access-5p248\") pod \"isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:05.125212 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:05.125178 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="6306d78c-102c-40cb-8924-588efa962ea2" containerID="e67d9b637d1c3901ba086e8266f79c880dcf29e2c074f1f9b4f866fd91b03dac" exitCode=2 Apr 25 00:11:05.125392 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:05.125252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" event={"ID":"6306d78c-102c-40cb-8924-588efa962ea2","Type":"ContainerDied","Data":"e67d9b637d1c3901ba086e8266f79c880dcf29e2c074f1f9b4f866fd91b03dac"} Apr 25 00:11:05.291627 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:05.291596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2701b787-c103-4814-829e-91bf7d8efe16-proxy-tls\") pod \"isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:05.294139 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:05.294116 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2701b787-c103-4814-829e-91bf7d8efe16-proxy-tls\") pod \"isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:05.520518 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:05.520487 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:05.652748 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:05.652720 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d"] Apr 25 00:11:05.654908 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:11:05.654876 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2701b787_c103_4814_829e_91bf7d8efe16.slice/crio-3b1ab29b1de6f41fba549802c2f559442a6c8d9b85d3bf2fa8224247a930914b WatchSource:0}: Error finding container 3b1ab29b1de6f41fba549802c2f559442a6c8d9b85d3bf2fa8224247a930914b: Status 404 returned error can't find the container with id 3b1ab29b1de6f41fba549802c2f559442a6c8d9b85d3bf2fa8224247a930914b Apr 25 00:11:05.656821 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:05.656800 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:11:06.130590 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:06.130546 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" event={"ID":"2701b787-c103-4814-829e-91bf7d8efe16","Type":"ContainerStarted","Data":"2055e85e5d1f78108b5b7ebffd13d5151b6d37e39713638b0d5c713ad760558e"} Apr 25 00:11:06.130590 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:06.130592 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" event={"ID":"2701b787-c103-4814-829e-91bf7d8efe16","Type":"ContainerStarted","Data":"3b1ab29b1de6f41fba549802c2f559442a6c8d9b85d3bf2fa8224247a930914b"} Apr 25 00:11:07.841160 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:07.841114 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.42:8643/healthz\": dial tcp 10.134.0.42:8643: connect: connection refused" Apr 25 00:11:09.143885 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:09.143852 2576 generic.go:358] "Generic (PLEG): container finished" podID="6306d78c-102c-40cb-8924-588efa962ea2" containerID="dbaab9529b63e85cc2ede40e6368c3232fd7d21ee440412ad6a4a9ba29b8792b" exitCode=0 Apr 25 00:11:09.144234 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:09.143930 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" event={"ID":"6306d78c-102c-40cb-8924-588efa962ea2","Type":"ContainerDied","Data":"dbaab9529b63e85cc2ede40e6368c3232fd7d21ee440412ad6a4a9ba29b8792b"} Apr 25 00:11:10.148869 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:10.148832 2576 generic.go:358] "Generic (PLEG): container finished" podID="2701b787-c103-4814-829e-91bf7d8efe16" containerID="2055e85e5d1f78108b5b7ebffd13d5151b6d37e39713638b0d5c713ad760558e" exitCode=0 Apr 25 00:11:10.149274 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:10.148886 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" event={"ID":"2701b787-c103-4814-829e-91bf7d8efe16","Type":"ContainerDied","Data":"2055e85e5d1f78108b5b7ebffd13d5151b6d37e39713638b0d5c713ad760558e"} Apr 25 00:11:11.154840 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:11.154804 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" event={"ID":"2701b787-c103-4814-829e-91bf7d8efe16","Type":"ContainerStarted","Data":"1b76e25350da9a92852cc29f76fa072d47d9a960dd1aafca945148c4d3b55bc4"} Apr 25 00:11:11.154840 ip-10-0-132-64 kubenswrapper[2576]: I0425 
00:11:11.154845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" event={"ID":"2701b787-c103-4814-829e-91bf7d8efe16","Type":"ContainerStarted","Data":"3a1c65d6780d5177a81f15f5b47fb4ff6b22126159cec7e3de23ce1f8368a0b1"} Apr 25 00:11:11.155279 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:11.155053 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:11.177315 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:11.177267 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podStartSLOduration=7.177236667 podStartE2EDuration="7.177236667s" podCreationTimestamp="2026-04-25 00:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:11:11.175292711 +0000 UTC m=+1032.336242353" watchObservedRunningTime="2026-04-25 00:11:11.177236667 +0000 UTC m=+1032.338186307" Apr 25 00:11:12.160994 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:12.160951 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:12.162135 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:12.162106 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:11:12.840812 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:12.840764 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" 
podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.42:8643/healthz\": dial tcp 10.134.0.42:8643: connect: connection refused" Apr 25 00:11:12.847146 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:12.847114 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:11:12.847453 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:12.847433 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:11:13.164283 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:13.164191 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:11:17.841272 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:17.841227 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.42:8643/healthz\": dial tcp 10.134.0.42:8643: connect: connection refused" Apr 25 00:11:17.841766 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:17.841402 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:11:18.170144 
ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:18.170059 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:11:18.170658 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:18.170633 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:11:22.841001 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:22.840953 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.42:8643/healthz\": dial tcp 10.134.0.42:8643: connect: connection refused" Apr 25 00:11:22.846500 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:22.846475 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:11:22.846893 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:22.846867 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:11:27.841242 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:27.841192 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.42:8643/healthz\": dial tcp 10.134.0.42:8643: connect: connection refused" Apr 25 00:11:28.170667 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:28.170579 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:11:32.841122 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:32.841072 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.42:8643/healthz\": dial tcp 10.134.0.42:8643: connect: connection refused" Apr 25 00:11:32.846486 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:32.846454 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:11:32.846611 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:32.846595 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:11:32.846857 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:32.846828 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:11:32.846957 ip-10-0-132-64 kubenswrapper[2576]: I0425 
00:11:32.846944 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:11:34.716488 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:34.716464 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:11:34.848147 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:34.848053 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-b8788-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6306d78c-102c-40cb-8924-588efa962ea2-isvc-logger-raw-b8788-kube-rbac-proxy-sar-config\") pod \"6306d78c-102c-40cb-8924-588efa962ea2\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " Apr 25 00:11:34.848147 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:34.848090 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6306d78c-102c-40cb-8924-588efa962ea2-proxy-tls\") pod \"6306d78c-102c-40cb-8924-588efa962ea2\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " Apr 25 00:11:34.848388 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:34.848214 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6306d78c-102c-40cb-8924-588efa962ea2-kserve-provision-location\") pod \"6306d78c-102c-40cb-8924-588efa962ea2\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " Apr 25 00:11:34.848388 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:34.848252 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z98l\" (UniqueName: \"kubernetes.io/projected/6306d78c-102c-40cb-8924-588efa962ea2-kube-api-access-2z98l\") pod \"6306d78c-102c-40cb-8924-588efa962ea2\" (UID: \"6306d78c-102c-40cb-8924-588efa962ea2\") " Apr 25 
00:11:34.848506 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:34.848482 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6306d78c-102c-40cb-8924-588efa962ea2-isvc-logger-raw-b8788-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-b8788-kube-rbac-proxy-sar-config") pod "6306d78c-102c-40cb-8924-588efa962ea2" (UID: "6306d78c-102c-40cb-8924-588efa962ea2"). InnerVolumeSpecName "isvc-logger-raw-b8788-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:11:34.848560 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:34.848508 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6306d78c-102c-40cb-8924-588efa962ea2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6306d78c-102c-40cb-8924-588efa962ea2" (UID: "6306d78c-102c-40cb-8924-588efa962ea2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:11:34.850506 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:34.850475 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6306d78c-102c-40cb-8924-588efa962ea2-kube-api-access-2z98l" (OuterVolumeSpecName: "kube-api-access-2z98l") pod "6306d78c-102c-40cb-8924-588efa962ea2" (UID: "6306d78c-102c-40cb-8924-588efa962ea2"). InnerVolumeSpecName "kube-api-access-2z98l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:11:34.850623 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:34.850520 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6306d78c-102c-40cb-8924-588efa962ea2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6306d78c-102c-40cb-8924-588efa962ea2" (UID: "6306d78c-102c-40cb-8924-588efa962ea2"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:11:34.949415 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:34.949368 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-b8788-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6306d78c-102c-40cb-8924-588efa962ea2-isvc-logger-raw-b8788-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:11:34.949415 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:34.949407 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6306d78c-102c-40cb-8924-588efa962ea2-proxy-tls\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:11:34.949415 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:34.949421 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6306d78c-102c-40cb-8924-588efa962ea2-kserve-provision-location\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:11:34.949659 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:34.949434 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2z98l\" (UniqueName: \"kubernetes.io/projected/6306d78c-102c-40cb-8924-588efa962ea2-kube-api-access-2z98l\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:11:35.241025 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.240927 2576 generic.go:358] "Generic (PLEG): container finished" podID="6306d78c-102c-40cb-8924-588efa962ea2" containerID="fbb50259e7d5775933429c2677b0cba44bd4210c54b911387fa38fbc9c866ed4" exitCode=0 Apr 25 00:11:35.241196 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.241016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" 
event={"ID":"6306d78c-102c-40cb-8924-588efa962ea2","Type":"ContainerDied","Data":"fbb50259e7d5775933429c2677b0cba44bd4210c54b911387fa38fbc9c866ed4"} Apr 25 00:11:35.241196 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.241062 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" event={"ID":"6306d78c-102c-40cb-8924-588efa962ea2","Type":"ContainerDied","Data":"2cb2cebbf3612bbcf4eea706a4197a727e2447530fef0a626b9f2a19669923d1"} Apr 25 00:11:35.241196 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.241079 2576 scope.go:117] "RemoveContainer" containerID="fbb50259e7d5775933429c2677b0cba44bd4210c54b911387fa38fbc9c866ed4" Apr 25 00:11:35.241196 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.241029 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k" Apr 25 00:11:35.251421 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.251392 2576 scope.go:117] "RemoveContainer" containerID="e67d9b637d1c3901ba086e8266f79c880dcf29e2c074f1f9b4f866fd91b03dac" Apr 25 00:11:35.258836 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.258816 2576 scope.go:117] "RemoveContainer" containerID="dbaab9529b63e85cc2ede40e6368c3232fd7d21ee440412ad6a4a9ba29b8792b" Apr 25 00:11:35.263810 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.263789 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k"] Apr 25 00:11:35.267300 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.267277 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-b8788-predictor-6cfdc7f55c-95t5k"] Apr 25 00:11:35.268012 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.267982 2576 scope.go:117] "RemoveContainer" containerID="f06c1a2144cda137009cd97f4e33ef65eb7143223a976771c76157e16c5a0681" Apr 25 00:11:35.275334 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:11:35.275319 2576 scope.go:117] "RemoveContainer" containerID="fbb50259e7d5775933429c2677b0cba44bd4210c54b911387fa38fbc9c866ed4" Apr 25 00:11:35.275555 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:11:35.275537 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbb50259e7d5775933429c2677b0cba44bd4210c54b911387fa38fbc9c866ed4\": container with ID starting with fbb50259e7d5775933429c2677b0cba44bd4210c54b911387fa38fbc9c866ed4 not found: ID does not exist" containerID="fbb50259e7d5775933429c2677b0cba44bd4210c54b911387fa38fbc9c866ed4" Apr 25 00:11:35.275604 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.275564 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb50259e7d5775933429c2677b0cba44bd4210c54b911387fa38fbc9c866ed4"} err="failed to get container status \"fbb50259e7d5775933429c2677b0cba44bd4210c54b911387fa38fbc9c866ed4\": rpc error: code = NotFound desc = could not find container \"fbb50259e7d5775933429c2677b0cba44bd4210c54b911387fa38fbc9c866ed4\": container with ID starting with fbb50259e7d5775933429c2677b0cba44bd4210c54b911387fa38fbc9c866ed4 not found: ID does not exist" Apr 25 00:11:35.275604 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.275583 2576 scope.go:117] "RemoveContainer" containerID="e67d9b637d1c3901ba086e8266f79c880dcf29e2c074f1f9b4f866fd91b03dac" Apr 25 00:11:35.275855 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:11:35.275836 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e67d9b637d1c3901ba086e8266f79c880dcf29e2c074f1f9b4f866fd91b03dac\": container with ID starting with e67d9b637d1c3901ba086e8266f79c880dcf29e2c074f1f9b4f866fd91b03dac not found: ID does not exist" containerID="e67d9b637d1c3901ba086e8266f79c880dcf29e2c074f1f9b4f866fd91b03dac" Apr 25 00:11:35.275905 ip-10-0-132-64 kubenswrapper[2576]: I0425 
00:11:35.275861 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e67d9b637d1c3901ba086e8266f79c880dcf29e2c074f1f9b4f866fd91b03dac"} err="failed to get container status \"e67d9b637d1c3901ba086e8266f79c880dcf29e2c074f1f9b4f866fd91b03dac\": rpc error: code = NotFound desc = could not find container \"e67d9b637d1c3901ba086e8266f79c880dcf29e2c074f1f9b4f866fd91b03dac\": container with ID starting with e67d9b637d1c3901ba086e8266f79c880dcf29e2c074f1f9b4f866fd91b03dac not found: ID does not exist" Apr 25 00:11:35.275905 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.275877 2576 scope.go:117] "RemoveContainer" containerID="dbaab9529b63e85cc2ede40e6368c3232fd7d21ee440412ad6a4a9ba29b8792b" Apr 25 00:11:35.276106 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:11:35.276089 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbaab9529b63e85cc2ede40e6368c3232fd7d21ee440412ad6a4a9ba29b8792b\": container with ID starting with dbaab9529b63e85cc2ede40e6368c3232fd7d21ee440412ad6a4a9ba29b8792b not found: ID does not exist" containerID="dbaab9529b63e85cc2ede40e6368c3232fd7d21ee440412ad6a4a9ba29b8792b" Apr 25 00:11:35.276151 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.276120 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbaab9529b63e85cc2ede40e6368c3232fd7d21ee440412ad6a4a9ba29b8792b"} err="failed to get container status \"dbaab9529b63e85cc2ede40e6368c3232fd7d21ee440412ad6a4a9ba29b8792b\": rpc error: code = NotFound desc = could not find container \"dbaab9529b63e85cc2ede40e6368c3232fd7d21ee440412ad6a4a9ba29b8792b\": container with ID starting with dbaab9529b63e85cc2ede40e6368c3232fd7d21ee440412ad6a4a9ba29b8792b not found: ID does not exist" Apr 25 00:11:35.276151 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.276136 2576 scope.go:117] "RemoveContainer" 
containerID="f06c1a2144cda137009cd97f4e33ef65eb7143223a976771c76157e16c5a0681" Apr 25 00:11:35.276366 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:11:35.276342 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f06c1a2144cda137009cd97f4e33ef65eb7143223a976771c76157e16c5a0681\": container with ID starting with f06c1a2144cda137009cd97f4e33ef65eb7143223a976771c76157e16c5a0681 not found: ID does not exist" containerID="f06c1a2144cda137009cd97f4e33ef65eb7143223a976771c76157e16c5a0681" Apr 25 00:11:35.276419 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.276366 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06c1a2144cda137009cd97f4e33ef65eb7143223a976771c76157e16c5a0681"} err="failed to get container status \"f06c1a2144cda137009cd97f4e33ef65eb7143223a976771c76157e16c5a0681\": rpc error: code = NotFound desc = could not find container \"f06c1a2144cda137009cd97f4e33ef65eb7143223a976771c76157e16c5a0681\": container with ID starting with f06c1a2144cda137009cd97f4e33ef65eb7143223a976771c76157e16c5a0681 not found: ID does not exist" Apr 25 00:11:35.401069 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:35.401032 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6306d78c-102c-40cb-8924-588efa962ea2" path="/var/lib/kubelet/pods/6306d78c-102c-40cb-8924-588efa962ea2/volumes" Apr 25 00:11:38.170626 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:38.170537 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:11:48.170912 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:48.170868 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:11:58.170797 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:11:58.170757 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:12:08.171470 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:12:08.171431 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:12:18.171441 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:12:18.171399 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:12:28.171256 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:12:28.171217 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:12:38.170956 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:12:38.170916 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" 
podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:12:38.395743 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:12:38.395677 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:12:48.396193 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:12:48.396148 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:12:58.395917 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:12:58.395874 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:13:08.396120 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:08.396028 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:13:18.396705 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:18.396655 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:13:28.396973 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:28.396935 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:13:34.786795 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.786762 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d"] Apr 25 00:13:34.787253 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.787145 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" containerID="cri-o://3a1c65d6780d5177a81f15f5b47fb4ff6b22126159cec7e3de23ce1f8368a0b1" gracePeriod=30 Apr 25 00:13:34.787397 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.787213 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kube-rbac-proxy" containerID="cri-o://1b76e25350da9a92852cc29f76fa072d47d9a960dd1aafca945148c4d3b55bc4" gracePeriod=30 Apr 25 00:13:34.897018 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.896985 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw"] Apr 25 00:13:34.897411 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.897397 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kserve-container" Apr 25 00:13:34.897457 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.897415 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6306d78c-102c-40cb-8924-588efa962ea2" 
containerName="kserve-container" Apr 25 00:13:34.897457 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.897435 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kube-rbac-proxy" Apr 25 00:13:34.897457 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.897440 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kube-rbac-proxy" Apr 25 00:13:34.897457 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.897451 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="storage-initializer" Apr 25 00:13:34.897457 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.897457 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="storage-initializer" Apr 25 00:13:34.897611 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.897474 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="agent" Apr 25 00:13:34.897611 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.897480 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="agent" Apr 25 00:13:34.897611 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.897539 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kserve-container" Apr 25 00:13:34.897611 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.897549 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6306d78c-102c-40cb-8924-588efa962ea2" containerName="kube-rbac-proxy" Apr 25 00:13:34.897611 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.897559 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6306d78c-102c-40cb-8924-588efa962ea2" 
containerName="agent" Apr 25 00:13:34.899943 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.899927 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:34.902076 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.901986 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-bbf281-predictor-serving-cert\"" Apr 25 00:13:34.902076 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.902024 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-bbf281-kube-rbac-proxy-sar-config\"" Apr 25 00:13:34.908446 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:34.908422 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw"] Apr 25 00:13:35.040123 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.040036 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-kserve-provision-location\") pod \"isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") " pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:35.040123 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.040081 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-bbf281-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-isvc-primary-bbf281-kube-rbac-proxy-sar-config\") pod \"isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") " pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:35.040123 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:13:35.040114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs7tk\" (UniqueName: \"kubernetes.io/projected/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-kube-api-access-hs7tk\") pod \"isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") " pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:35.040338 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.040174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-proxy-tls\") pod \"isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") " pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:35.142478 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.141627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-kserve-provision-location\") pod \"isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") " pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:35.142478 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.141731 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-bbf281-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-isvc-primary-bbf281-kube-rbac-proxy-sar-config\") pod \"isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") " pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:35.142478 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.141796 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs7tk\" (UniqueName: \"kubernetes.io/projected/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-kube-api-access-hs7tk\") pod \"isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") " pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:35.142478 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.141882 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-proxy-tls\") pod \"isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") " pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:35.142478 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.142427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-kserve-provision-location\") pod \"isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") " pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:35.143083 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.143053 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-bbf281-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-isvc-primary-bbf281-kube-rbac-proxy-sar-config\") pod \"isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") " pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:35.145781 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.145754 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-proxy-tls\") pod \"isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") " pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:35.151709 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.151668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs7tk\" (UniqueName: \"kubernetes.io/projected/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-kube-api-access-hs7tk\") pod \"isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") " pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:35.211647 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.211620 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:35.336355 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.336329 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw"] Apr 25 00:13:35.338368 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:13:35.338334 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbff0fd0_0ea3_4820_b252_cc6b5cfd4216.slice/crio-c0ec22e5171b9aa4112a6aa9ae79a747da22a047dca5d760f56a25f791604485 WatchSource:0}: Error finding container c0ec22e5171b9aa4112a6aa9ae79a747da22a047dca5d760f56a25f791604485: Status 404 returned error can't find the container with id c0ec22e5171b9aa4112a6aa9ae79a747da22a047dca5d760f56a25f791604485 Apr 25 00:13:35.669017 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.668903 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" 
event={"ID":"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216","Type":"ContainerStarted","Data":"d8a9139661ec057985b1dbbbbae2d19625a7845237a4ddf2c816cdc1775da282"} Apr 25 00:13:35.669017 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.668968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" event={"ID":"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216","Type":"ContainerStarted","Data":"c0ec22e5171b9aa4112a6aa9ae79a747da22a047dca5d760f56a25f791604485"} Apr 25 00:13:35.670933 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.670908 2576 generic.go:358] "Generic (PLEG): container finished" podID="2701b787-c103-4814-829e-91bf7d8efe16" containerID="1b76e25350da9a92852cc29f76fa072d47d9a960dd1aafca945148c4d3b55bc4" exitCode=2 Apr 25 00:13:35.671031 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:35.670951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" event={"ID":"2701b787-c103-4814-829e-91bf7d8efe16","Type":"ContainerDied","Data":"1b76e25350da9a92852cc29f76fa072d47d9a960dd1aafca945148c4d3b55bc4"} Apr 25 00:13:38.164985 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:38.164946 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.43:8643/healthz\": dial tcp 10.134.0.43:8643: connect: connection refused" Apr 25 00:13:38.396092 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:38.396046 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 25 00:13:39.687902 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:13:39.687860 2576 generic.go:358] "Generic (PLEG): container finished" podID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerID="d8a9139661ec057985b1dbbbbae2d19625a7845237a4ddf2c816cdc1775da282" exitCode=0 Apr 25 00:13:39.688275 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:39.687934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" event={"ID":"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216","Type":"ContainerDied","Data":"d8a9139661ec057985b1dbbbbae2d19625a7845237a4ddf2c816cdc1775da282"} Apr 25 00:13:40.693136 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:40.693102 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" event={"ID":"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216","Type":"ContainerStarted","Data":"6ababe015db86786a583c7910567a6d0a4692e8dfc68e5e46ebd45dd481939ba"} Apr 25 00:13:40.693513 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:40.693146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" event={"ID":"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216","Type":"ContainerStarted","Data":"a42a8394a95449aecfe7f679356918c6a070329acf1ebb6e648bf5a08a108d61"} Apr 25 00:13:40.693513 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:40.693349 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:41.696708 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:41.696658 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:41.698169 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:41.698143 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" 
podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 25 00:13:42.700131 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:42.700090 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 25 00:13:43.164863 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:43.164819 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.43:8643/healthz\": dial tcp 10.134.0.43:8643: connect: connection refused" Apr 25 00:13:44.335145 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.335122 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:13:44.353783 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.353720 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" podStartSLOduration=10.353683552 podStartE2EDuration="10.353683552s" podCreationTimestamp="2026-04-25 00:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:13:40.72802868 +0000 UTC m=+1181.888978333" watchObservedRunningTime="2026-04-25 00:13:44.353683552 +0000 UTC m=+1185.514633193" Apr 25 00:13:44.424435 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.424349 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-e81d6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2701b787-c103-4814-829e-91bf7d8efe16-isvc-sklearn-scale-raw-e81d6-kube-rbac-proxy-sar-config\") pod \"2701b787-c103-4814-829e-91bf7d8efe16\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " Apr 25 00:13:44.424435 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.424391 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p248\" (UniqueName: \"kubernetes.io/projected/2701b787-c103-4814-829e-91bf7d8efe16-kube-api-access-5p248\") pod \"2701b787-c103-4814-829e-91bf7d8efe16\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " Apr 25 00:13:44.424435 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.424415 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2701b787-c103-4814-829e-91bf7d8efe16-kserve-provision-location\") pod \"2701b787-c103-4814-829e-91bf7d8efe16\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " Apr 25 00:13:44.424750 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:13:44.424502 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2701b787-c103-4814-829e-91bf7d8efe16-proxy-tls\") pod \"2701b787-c103-4814-829e-91bf7d8efe16\" (UID: \"2701b787-c103-4814-829e-91bf7d8efe16\") " Apr 25 00:13:44.424829 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.424762 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2701b787-c103-4814-829e-91bf7d8efe16-isvc-sklearn-scale-raw-e81d6-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-scale-raw-e81d6-kube-rbac-proxy-sar-config") pod "2701b787-c103-4814-829e-91bf7d8efe16" (UID: "2701b787-c103-4814-829e-91bf7d8efe16"). InnerVolumeSpecName "isvc-sklearn-scale-raw-e81d6-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:13:44.424829 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.424793 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2701b787-c103-4814-829e-91bf7d8efe16-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2701b787-c103-4814-829e-91bf7d8efe16" (UID: "2701b787-c103-4814-829e-91bf7d8efe16"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:13:44.426622 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.426595 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2701b787-c103-4814-829e-91bf7d8efe16-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2701b787-c103-4814-829e-91bf7d8efe16" (UID: "2701b787-c103-4814-829e-91bf7d8efe16"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:13:44.426720 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.426625 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2701b787-c103-4814-829e-91bf7d8efe16-kube-api-access-5p248" (OuterVolumeSpecName: "kube-api-access-5p248") pod "2701b787-c103-4814-829e-91bf7d8efe16" (UID: "2701b787-c103-4814-829e-91bf7d8efe16"). InnerVolumeSpecName "kube-api-access-5p248". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:13:44.526045 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.526011 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2701b787-c103-4814-829e-91bf7d8efe16-proxy-tls\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:13:44.526045 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.526041 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-e81d6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2701b787-c103-4814-829e-91bf7d8efe16-isvc-sklearn-scale-raw-e81d6-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:13:44.526045 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.526052 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5p248\" (UniqueName: \"kubernetes.io/projected/2701b787-c103-4814-829e-91bf7d8efe16-kube-api-access-5p248\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:13:44.526276 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.526062 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2701b787-c103-4814-829e-91bf7d8efe16-kserve-provision-location\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:13:44.708119 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.708031 2576 generic.go:358] "Generic (PLEG): 
container finished" podID="2701b787-c103-4814-829e-91bf7d8efe16" containerID="3a1c65d6780d5177a81f15f5b47fb4ff6b22126159cec7e3de23ce1f8368a0b1" exitCode=0 Apr 25 00:13:44.708119 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.708094 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" event={"ID":"2701b787-c103-4814-829e-91bf7d8efe16","Type":"ContainerDied","Data":"3a1c65d6780d5177a81f15f5b47fb4ff6b22126159cec7e3de23ce1f8368a0b1"} Apr 25 00:13:44.708119 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.708113 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" Apr 25 00:13:44.708380 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.708130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d" event={"ID":"2701b787-c103-4814-829e-91bf7d8efe16","Type":"ContainerDied","Data":"3b1ab29b1de6f41fba549802c2f559442a6c8d9b85d3bf2fa8224247a930914b"} Apr 25 00:13:44.708380 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.708144 2576 scope.go:117] "RemoveContainer" containerID="1b76e25350da9a92852cc29f76fa072d47d9a960dd1aafca945148c4d3b55bc4" Apr 25 00:13:44.719757 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.719736 2576 scope.go:117] "RemoveContainer" containerID="3a1c65d6780d5177a81f15f5b47fb4ff6b22126159cec7e3de23ce1f8368a0b1" Apr 25 00:13:44.729080 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.729053 2576 scope.go:117] "RemoveContainer" containerID="2055e85e5d1f78108b5b7ebffd13d5151b6d37e39713638b0d5c713ad760558e" Apr 25 00:13:44.729672 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.729632 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d"] Apr 25 00:13:44.733416 ip-10-0-132-64 kubenswrapper[2576]: 
I0425 00:13:44.733387 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-e81d6-predictor-6c8678b65-zmk7d"] Apr 25 00:13:44.738440 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.738418 2576 scope.go:117] "RemoveContainer" containerID="1b76e25350da9a92852cc29f76fa072d47d9a960dd1aafca945148c4d3b55bc4" Apr 25 00:13:44.738743 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:13:44.738725 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b76e25350da9a92852cc29f76fa072d47d9a960dd1aafca945148c4d3b55bc4\": container with ID starting with 1b76e25350da9a92852cc29f76fa072d47d9a960dd1aafca945148c4d3b55bc4 not found: ID does not exist" containerID="1b76e25350da9a92852cc29f76fa072d47d9a960dd1aafca945148c4d3b55bc4" Apr 25 00:13:44.738835 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.738751 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b76e25350da9a92852cc29f76fa072d47d9a960dd1aafca945148c4d3b55bc4"} err="failed to get container status \"1b76e25350da9a92852cc29f76fa072d47d9a960dd1aafca945148c4d3b55bc4\": rpc error: code = NotFound desc = could not find container \"1b76e25350da9a92852cc29f76fa072d47d9a960dd1aafca945148c4d3b55bc4\": container with ID starting with 1b76e25350da9a92852cc29f76fa072d47d9a960dd1aafca945148c4d3b55bc4 not found: ID does not exist" Apr 25 00:13:44.738835 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.738769 2576 scope.go:117] "RemoveContainer" containerID="3a1c65d6780d5177a81f15f5b47fb4ff6b22126159cec7e3de23ce1f8368a0b1" Apr 25 00:13:44.739016 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:13:44.738998 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1c65d6780d5177a81f15f5b47fb4ff6b22126159cec7e3de23ce1f8368a0b1\": container with ID starting with 
3a1c65d6780d5177a81f15f5b47fb4ff6b22126159cec7e3de23ce1f8368a0b1 not found: ID does not exist" containerID="3a1c65d6780d5177a81f15f5b47fb4ff6b22126159cec7e3de23ce1f8368a0b1" Apr 25 00:13:44.739065 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.739024 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1c65d6780d5177a81f15f5b47fb4ff6b22126159cec7e3de23ce1f8368a0b1"} err="failed to get container status \"3a1c65d6780d5177a81f15f5b47fb4ff6b22126159cec7e3de23ce1f8368a0b1\": rpc error: code = NotFound desc = could not find container \"3a1c65d6780d5177a81f15f5b47fb4ff6b22126159cec7e3de23ce1f8368a0b1\": container with ID starting with 3a1c65d6780d5177a81f15f5b47fb4ff6b22126159cec7e3de23ce1f8368a0b1 not found: ID does not exist" Apr 25 00:13:44.739065 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.739040 2576 scope.go:117] "RemoveContainer" containerID="2055e85e5d1f78108b5b7ebffd13d5151b6d37e39713638b0d5c713ad760558e" Apr 25 00:13:44.739281 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:13:44.739264 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2055e85e5d1f78108b5b7ebffd13d5151b6d37e39713638b0d5c713ad760558e\": container with ID starting with 2055e85e5d1f78108b5b7ebffd13d5151b6d37e39713638b0d5c713ad760558e not found: ID does not exist" containerID="2055e85e5d1f78108b5b7ebffd13d5151b6d37e39713638b0d5c713ad760558e" Apr 25 00:13:44.739320 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:44.739285 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2055e85e5d1f78108b5b7ebffd13d5151b6d37e39713638b0d5c713ad760558e"} err="failed to get container status \"2055e85e5d1f78108b5b7ebffd13d5151b6d37e39713638b0d5c713ad760558e\": rpc error: code = NotFound desc = could not find container \"2055e85e5d1f78108b5b7ebffd13d5151b6d37e39713638b0d5c713ad760558e\": container with ID starting with 
2055e85e5d1f78108b5b7ebffd13d5151b6d37e39713638b0d5c713ad760558e not found: ID does not exist" Apr 25 00:13:45.399969 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:45.399938 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2701b787-c103-4814-829e-91bf7d8efe16" path="/var/lib/kubelet/pods/2701b787-c103-4814-829e-91bf7d8efe16/volumes" Apr 25 00:13:47.704403 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:47.704373 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:13:47.704909 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:47.704884 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 25 00:13:57.705870 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:57.705830 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 25 00:13:59.377649 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:59.377621 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/1.log" Apr 25 00:13:59.380339 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:59.380316 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/1.log" Apr 25 00:13:59.382479 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:59.382455 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log" Apr 25 00:13:59.384913 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:13:59.384894 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log" Apr 25 00:14:07.704983 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:14:07.704937 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 25 00:14:17.705927 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:14:17.705881 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 25 00:14:27.705853 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:14:27.705815 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 25 00:14:37.705918 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:14:37.705817 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 25 00:14:47.705293 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:14:47.705250 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 25 00:14:57.706143 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:14:57.706106 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" Apr 25 00:15:05.083906 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.083870 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm"] Apr 25 00:15:05.084290 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.084251 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kube-rbac-proxy" Apr 25 00:15:05.084290 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.084261 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kube-rbac-proxy" Apr 25 00:15:05.084290 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.084274 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="storage-initializer" Apr 25 00:15:05.084290 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.084280 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="storage-initializer" Apr 25 00:15:05.084290 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.084289 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" Apr 25 00:15:05.084452 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.084295 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" Apr 25 00:15:05.084452 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.084357 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kserve-container" Apr 25 00:15:05.084452 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.084366 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2701b787-c103-4814-829e-91bf7d8efe16" containerName="kube-rbac-proxy" Apr 25 00:15:05.087631 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.087615 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.089719 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.089669 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-bbf281\"" Apr 25 00:15:05.089850 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.089669 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-bbf281-predictor-serving-cert\"" Apr 25 00:15:05.089850 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.089733 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-bbf281-kube-rbac-proxy-sar-config\"" Apr 25 00:15:05.089950 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.089861 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 25 00:15:05.089950 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.089908 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-bbf281-dockercfg-lx8v9\"" Apr 25 00:15:05.096272 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.096245 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-bbf281-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15cd8bda-e985-47ae-97fd-c320a2448ac4-isvc-secondary-bbf281-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.096435 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.096306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/15cd8bda-e985-47ae-97fd-c320a2448ac4-cabundle-cert\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.096435 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.096364 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztj6w\" (UniqueName: \"kubernetes.io/projected/15cd8bda-e985-47ae-97fd-c320a2448ac4-kube-api-access-ztj6w\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.096606 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.096438 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15cd8bda-e985-47ae-97fd-c320a2448ac4-kserve-provision-location\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.096606 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.096471 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15cd8bda-e985-47ae-97fd-c320a2448ac4-proxy-tls\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.098962 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.098941 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm"] Apr 25 00:15:05.197290 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.197249 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-bbf281-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15cd8bda-e985-47ae-97fd-c320a2448ac4-isvc-secondary-bbf281-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.197498 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.197304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/15cd8bda-e985-47ae-97fd-c320a2448ac4-cabundle-cert\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.197498 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.197330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztj6w\" (UniqueName: \"kubernetes.io/projected/15cd8bda-e985-47ae-97fd-c320a2448ac4-kube-api-access-ztj6w\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " 
pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.197498 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.197377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15cd8bda-e985-47ae-97fd-c320a2448ac4-kserve-provision-location\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.197498 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.197403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15cd8bda-e985-47ae-97fd-c320a2448ac4-proxy-tls\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.197685 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:15:05.197515 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-serving-cert: secret "isvc-secondary-bbf281-predictor-serving-cert" not found Apr 25 00:15:05.197685 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:15:05.197581 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15cd8bda-e985-47ae-97fd-c320a2448ac4-proxy-tls podName:15cd8bda-e985-47ae-97fd-c320a2448ac4 nodeName:}" failed. No retries permitted until 2026-04-25 00:15:05.697561028 +0000 UTC m=+1266.858510662 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/15cd8bda-e985-47ae-97fd-c320a2448ac4-proxy-tls") pod "isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" (UID: "15cd8bda-e985-47ae-97fd-c320a2448ac4") : secret "isvc-secondary-bbf281-predictor-serving-cert" not found Apr 25 00:15:05.197892 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.197872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15cd8bda-e985-47ae-97fd-c320a2448ac4-kserve-provision-location\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.198021 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.198002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-bbf281-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15cd8bda-e985-47ae-97fd-c320a2448ac4-isvc-secondary-bbf281-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.198059 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.198026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/15cd8bda-e985-47ae-97fd-c320a2448ac4-cabundle-cert\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.205429 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.205408 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztj6w\" (UniqueName: 
\"kubernetes.io/projected/15cd8bda-e985-47ae-97fd-c320a2448ac4-kube-api-access-ztj6w\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.701847 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.701819 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15cd8bda-e985-47ae-97fd-c320a2448ac4-proxy-tls\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.708673 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.705104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15cd8bda-e985-47ae-97fd-c320a2448ac4-proxy-tls\") pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") " pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:05.998853 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:05.998822 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" Apr 25 00:15:06.125215 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:06.125190 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm"] Apr 25 00:15:06.127373 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:15:06.127346 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15cd8bda_e985_47ae_97fd_c320a2448ac4.slice/crio-40ee2ad772aa6eb03de535eda7b87676355ca6016896d73f485b8ab61d7b8073 WatchSource:0}: Error finding container 40ee2ad772aa6eb03de535eda7b87676355ca6016896d73f485b8ab61d7b8073: Status 404 returned error can't find the container with id 40ee2ad772aa6eb03de535eda7b87676355ca6016896d73f485b8ab61d7b8073 Apr 25 00:15:07.006830 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:07.006797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" event={"ID":"15cd8bda-e985-47ae-97fd-c320a2448ac4","Type":"ContainerStarted","Data":"14dd11ac26a6208371b4f4f7b14fb469efd4b4115f6fef9cc6cd325299e2ea70"} Apr 25 00:15:07.006830 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:07.006833 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" event={"ID":"15cd8bda-e985-47ae-97fd-c320a2448ac4","Type":"ContainerStarted","Data":"40ee2ad772aa6eb03de535eda7b87676355ca6016896d73f485b8ab61d7b8073"} Apr 25 00:15:10.018640 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:10.018607 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-bbf281-predictor-7f76d5b559-s84pm_15cd8bda-e985-47ae-97fd-c320a2448ac4/storage-initializer/0.log" Apr 25 00:15:10.019132 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:10.018652 2576 generic.go:358] "Generic (PLEG): container 
finished" podID="15cd8bda-e985-47ae-97fd-c320a2448ac4" containerID="14dd11ac26a6208371b4f4f7b14fb469efd4b4115f6fef9cc6cd325299e2ea70" exitCode=1 Apr 25 00:15:10.019132 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:10.018737 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" event={"ID":"15cd8bda-e985-47ae-97fd-c320a2448ac4","Type":"ContainerDied","Data":"14dd11ac26a6208371b4f4f7b14fb469efd4b4115f6fef9cc6cd325299e2ea70"} Apr 25 00:15:11.024746 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:11.024710 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-bbf281-predictor-7f76d5b559-s84pm_15cd8bda-e985-47ae-97fd-c320a2448ac4/storage-initializer/0.log" Apr 25 00:15:11.025183 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:11.024837 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" event={"ID":"15cd8bda-e985-47ae-97fd-c320a2448ac4","Type":"ContainerStarted","Data":"d8a7411eaf9684ee9ba394a3b7dde49deca4bfe058ca0b3bbaefe005227349f5"} Apr 25 00:15:14.038365 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:14.038335 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-bbf281-predictor-7f76d5b559-s84pm_15cd8bda-e985-47ae-97fd-c320a2448ac4/storage-initializer/1.log" Apr 25 00:15:14.038819 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:14.038674 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-bbf281-predictor-7f76d5b559-s84pm_15cd8bda-e985-47ae-97fd-c320a2448ac4/storage-initializer/0.log" Apr 25 00:15:14.038819 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:14.038726 2576 generic.go:358] "Generic (PLEG): container finished" podID="15cd8bda-e985-47ae-97fd-c320a2448ac4" containerID="d8a7411eaf9684ee9ba394a3b7dde49deca4bfe058ca0b3bbaefe005227349f5" exitCode=1 Apr 25 00:15:14.038819 
ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:14.038803 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" event={"ID":"15cd8bda-e985-47ae-97fd-c320a2448ac4","Type":"ContainerDied","Data":"d8a7411eaf9684ee9ba394a3b7dde49deca4bfe058ca0b3bbaefe005227349f5"}
Apr 25 00:15:14.038938 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:14.038844 2576 scope.go:117] "RemoveContainer" containerID="14dd11ac26a6208371b4f4f7b14fb469efd4b4115f6fef9cc6cd325299e2ea70"
Apr 25 00:15:14.039223 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:14.039209 2576 scope.go:117] "RemoveContainer" containerID="14dd11ac26a6208371b4f4f7b14fb469efd4b4115f6fef9cc6cd325299e2ea70"
Apr 25 00:15:14.050733 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:15:14.050676 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-bbf281-predictor-7f76d5b559-s84pm_kserve-ci-e2e-test_15cd8bda-e985-47ae-97fd-c320a2448ac4_0 in pod sandbox 40ee2ad772aa6eb03de535eda7b87676355ca6016896d73f485b8ab61d7b8073 from index: no such id: '14dd11ac26a6208371b4f4f7b14fb469efd4b4115f6fef9cc6cd325299e2ea70'" containerID="14dd11ac26a6208371b4f4f7b14fb469efd4b4115f6fef9cc6cd325299e2ea70"
Apr 25 00:15:14.050876 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:15:14.050760 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-bbf281-predictor-7f76d5b559-s84pm_kserve-ci-e2e-test_15cd8bda-e985-47ae-97fd-c320a2448ac4_0 in pod sandbox 40ee2ad772aa6eb03de535eda7b87676355ca6016896d73f485b8ab61d7b8073 from index: no such id: '14dd11ac26a6208371b4f4f7b14fb469efd4b4115f6fef9cc6cd325299e2ea70'; Skipping pod \"isvc-secondary-bbf281-predictor-7f76d5b559-s84pm_kserve-ci-e2e-test(15cd8bda-e985-47ae-97fd-c320a2448ac4)\"" logger="UnhandledError"
Apr 25 00:15:14.052179 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:15:14.052154 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-bbf281-predictor-7f76d5b559-s84pm_kserve-ci-e2e-test(15cd8bda-e985-47ae-97fd-c320a2448ac4)\"" pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" podUID="15cd8bda-e985-47ae-97fd-c320a2448ac4"
Apr 25 00:15:15.043447 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:15.043421 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-bbf281-predictor-7f76d5b559-s84pm_15cd8bda-e985-47ae-97fd-c320a2448ac4/storage-initializer/1.log"
Apr 25 00:15:21.119537 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.119497 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm"]
Apr 25 00:15:21.168272 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.168238 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw"]
Apr 25 00:15:21.168616 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.168568 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kserve-container" containerID="cri-o://a42a8394a95449aecfe7f679356918c6a070329acf1ebb6e648bf5a08a108d61" gracePeriod=30
Apr 25 00:15:21.168726 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.168593 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kube-rbac-proxy" containerID="cri-o://6ababe015db86786a583c7910567a6d0a4692e8dfc68e5e46ebd45dd481939ba" gracePeriod=30
Apr 25 00:15:21.261858 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.261822 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"]
Apr 25 00:15:21.267108 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.267082 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.269248 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.269218 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-8448e3\""
Apr 25 00:15:21.269384 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.269218 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-8448e3-dockercfg-4md24\""
Apr 25 00:15:21.269384 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.269348 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-8448e3-predictor-serving-cert\""
Apr 25 00:15:21.269736 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.269493 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-8448e3-kube-rbac-proxy-sar-config\""
Apr 25 00:15:21.273522 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.273500 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"]
Apr 25 00:15:21.291364 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.291346 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-bbf281-predictor-7f76d5b559-s84pm_15cd8bda-e985-47ae-97fd-c320a2448ac4/storage-initializer/1.log"
Apr 25 00:15:21.291466 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.291419 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm"
Apr 25 00:15:21.344685 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.344658 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztj6w\" (UniqueName: \"kubernetes.io/projected/15cd8bda-e985-47ae-97fd-c320a2448ac4-kube-api-access-ztj6w\") pod \"15cd8bda-e985-47ae-97fd-c320a2448ac4\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") "
Apr 25 00:15:21.344847 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.344716 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15cd8bda-e985-47ae-97fd-c320a2448ac4-proxy-tls\") pod \"15cd8bda-e985-47ae-97fd-c320a2448ac4\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") "
Apr 25 00:15:21.344847 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.344751 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-bbf281-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15cd8bda-e985-47ae-97fd-c320a2448ac4-isvc-secondary-bbf281-kube-rbac-proxy-sar-config\") pod \"15cd8bda-e985-47ae-97fd-c320a2448ac4\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") "
Apr 25 00:15:21.344847 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.344819 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15cd8bda-e985-47ae-97fd-c320a2448ac4-kserve-provision-location\") pod \"15cd8bda-e985-47ae-97fd-c320a2448ac4\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") "
Apr 25 00:15:21.345011 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.344915 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/15cd8bda-e985-47ae-97fd-c320a2448ac4-cabundle-cert\") pod \"15cd8bda-e985-47ae-97fd-c320a2448ac4\" (UID: \"15cd8bda-e985-47ae-97fd-c320a2448ac4\") "
Apr 25 00:15:21.345066 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.345030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-8448e3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0339805-0726-40e0-829f-34ba5030726d-isvc-init-fail-8448e3-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.345125 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.345064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0339805-0726-40e0-829f-34ba5030726d-kserve-provision-location\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.345125 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.345112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phrj5\" (UniqueName: \"kubernetes.io/projected/f0339805-0726-40e0-829f-34ba5030726d-kube-api-access-phrj5\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.345229 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.345120 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15cd8bda-e985-47ae-97fd-c320a2448ac4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "15cd8bda-e985-47ae-97fd-c320a2448ac4" (UID: "15cd8bda-e985-47ae-97fd-c320a2448ac4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:15:21.345229 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.345139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f0339805-0726-40e0-829f-34ba5030726d-cabundle-cert\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.345229 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.345133 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15cd8bda-e985-47ae-97fd-c320a2448ac4-isvc-secondary-bbf281-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-bbf281-kube-rbac-proxy-sar-config") pod "15cd8bda-e985-47ae-97fd-c320a2448ac4" (UID: "15cd8bda-e985-47ae-97fd-c320a2448ac4"). InnerVolumeSpecName "isvc-secondary-bbf281-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:15:21.345428 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.345404 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15cd8bda-e985-47ae-97fd-c320a2448ac4-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "15cd8bda-e985-47ae-97fd-c320a2448ac4" (UID: "15cd8bda-e985-47ae-97fd-c320a2448ac4"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:15:21.345503 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.345398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0339805-0726-40e0-829f-34ba5030726d-proxy-tls\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.345657 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.345637 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-bbf281-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15cd8bda-e985-47ae-97fd-c320a2448ac4-isvc-secondary-bbf281-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 25 00:15:21.345657 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.345663 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15cd8bda-e985-47ae-97fd-c320a2448ac4-kserve-provision-location\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 25 00:15:21.345851 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.345683 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/15cd8bda-e985-47ae-97fd-c320a2448ac4-cabundle-cert\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 25 00:15:21.347112 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.347085 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15cd8bda-e985-47ae-97fd-c320a2448ac4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "15cd8bda-e985-47ae-97fd-c320a2448ac4" (UID: "15cd8bda-e985-47ae-97fd-c320a2448ac4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:15:21.347213 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.347105 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15cd8bda-e985-47ae-97fd-c320a2448ac4-kube-api-access-ztj6w" (OuterVolumeSpecName: "kube-api-access-ztj6w") pod "15cd8bda-e985-47ae-97fd-c320a2448ac4" (UID: "15cd8bda-e985-47ae-97fd-c320a2448ac4"). InnerVolumeSpecName "kube-api-access-ztj6w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:15:21.446929 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.446897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-8448e3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0339805-0726-40e0-829f-34ba5030726d-isvc-init-fail-8448e3-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.446929 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.446935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0339805-0726-40e0-829f-34ba5030726d-kserve-provision-location\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.447222 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.447062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phrj5\" (UniqueName: \"kubernetes.io/projected/f0339805-0726-40e0-829f-34ba5030726d-kube-api-access-phrj5\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.447222 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.447095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f0339805-0726-40e0-829f-34ba5030726d-cabundle-cert\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.447222 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.447198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0339805-0726-40e0-829f-34ba5030726d-proxy-tls\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.447366 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.447252 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ztj6w\" (UniqueName: \"kubernetes.io/projected/15cd8bda-e985-47ae-97fd-c320a2448ac4-kube-api-access-ztj6w\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 25 00:15:21.447366 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.447269 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15cd8bda-e985-47ae-97fd-c320a2448ac4-proxy-tls\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 25 00:15:21.447366 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.447329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0339805-0726-40e0-829f-34ba5030726d-kserve-provision-location\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.447706 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.447675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-8448e3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0339805-0726-40e0-829f-34ba5030726d-isvc-init-fail-8448e3-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.447792 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.447763 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f0339805-0726-40e0-829f-34ba5030726d-cabundle-cert\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.449754 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.449734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0339805-0726-40e0-829f-34ba5030726d-proxy-tls\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.455275 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.455258 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phrj5\" (UniqueName: \"kubernetes.io/projected/f0339805-0726-40e0-829f-34ba5030726d-kube-api-access-phrj5\") pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.590030 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.589987 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"
Apr 25 00:15:21.716801 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:21.716773 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"]
Apr 25 00:15:21.718784 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:15:21.718753 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0339805_0726_40e0_829f_34ba5030726d.slice/crio-cf80a30631a7770d3aa6e0faec44736a5e4bc595fcc8ca02ea6a0b4a1bbcfa72 WatchSource:0}: Error finding container cf80a30631a7770d3aa6e0faec44736a5e4bc595fcc8ca02ea6a0b4a1bbcfa72: Status 404 returned error can't find the container with id cf80a30631a7770d3aa6e0faec44736a5e4bc595fcc8ca02ea6a0b4a1bbcfa72
Apr 25 00:15:22.075232 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:22.075198 2576 generic.go:358] "Generic (PLEG): container finished" podID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerID="6ababe015db86786a583c7910567a6d0a4692e8dfc68e5e46ebd45dd481939ba" exitCode=2
Apr 25 00:15:22.075413 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:22.075269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" event={"ID":"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216","Type":"ContainerDied","Data":"6ababe015db86786a583c7910567a6d0a4692e8dfc68e5e46ebd45dd481939ba"}
Apr 25 00:15:22.076359 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:22.076340 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-bbf281-predictor-7f76d5b559-s84pm_15cd8bda-e985-47ae-97fd-c320a2448ac4/storage-initializer/1.log"
Apr 25 00:15:22.076514 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:22.076475 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm" event={"ID":"15cd8bda-e985-47ae-97fd-c320a2448ac4","Type":"ContainerDied","Data":"40ee2ad772aa6eb03de535eda7b87676355ca6016896d73f485b8ab61d7b8073"}
Apr 25 00:15:22.076631 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:22.076490 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm"
Apr 25 00:15:22.076711 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:22.076529 2576 scope.go:117] "RemoveContainer" containerID="d8a7411eaf9684ee9ba394a3b7dde49deca4bfe058ca0b3bbaefe005227349f5"
Apr 25 00:15:22.077964 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:22.077939 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn" event={"ID":"f0339805-0726-40e0-829f-34ba5030726d","Type":"ContainerStarted","Data":"2a22427d863eeeed9a172b5d1ba5808b962dacb87b4dd728e892e39811bcb288"}
Apr 25 00:15:22.078073 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:22.077972 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn" event={"ID":"f0339805-0726-40e0-829f-34ba5030726d","Type":"ContainerStarted","Data":"cf80a30631a7770d3aa6e0faec44736a5e4bc595fcc8ca02ea6a0b4a1bbcfa72"}
Apr 25 00:15:22.122418 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:22.122389 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm"]
Apr 25 00:15:22.126545 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:22.126522 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-bbf281-predictor-7f76d5b559-s84pm"]
Apr 25 00:15:22.700780 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:22.700734 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.44:8643/healthz\": dial tcp 10.134.0.44:8643: connect: connection refused"
Apr 25 00:15:23.399957 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:23.399924 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15cd8bda-e985-47ae-97fd-c320a2448ac4" path="/var/lib/kubelet/pods/15cd8bda-e985-47ae-97fd-c320a2448ac4/volumes"
Apr 25 00:15:25.512607 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:25.512583 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw"
Apr 25 00:15:25.583529 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:25.583447 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-bbf281-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-isvc-primary-bbf281-kube-rbac-proxy-sar-config\") pod \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") "
Apr 25 00:15:25.583529 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:25.583494 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-proxy-tls\") pod \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") "
Apr 25 00:15:25.583783 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:25.583575 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-kserve-provision-location\") pod \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") "
Apr 25 00:15:25.583783 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:25.583608 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs7tk\" (UniqueName: \"kubernetes.io/projected/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-kube-api-access-hs7tk\") pod \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\" (UID: \"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216\") "
Apr 25 00:15:25.583899 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:25.583871 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-isvc-primary-bbf281-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-bbf281-kube-rbac-proxy-sar-config") pod "cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" (UID: "cbff0fd0-0ea3-4820-b252-cc6b5cfd4216"). InnerVolumeSpecName "isvc-primary-bbf281-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:15:25.583968 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:25.583942 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" (UID: "cbff0fd0-0ea3-4820-b252-cc6b5cfd4216"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:15:25.585884 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:25.585863 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-kube-api-access-hs7tk" (OuterVolumeSpecName: "kube-api-access-hs7tk") pod "cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" (UID: "cbff0fd0-0ea3-4820-b252-cc6b5cfd4216"). InnerVolumeSpecName "kube-api-access-hs7tk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:15:25.585965 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:25.585920 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" (UID: "cbff0fd0-0ea3-4820-b252-cc6b5cfd4216"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:15:25.684756 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:25.684716 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-bbf281-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-isvc-primary-bbf281-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 25 00:15:25.684756 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:25.684747 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-proxy-tls\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 25 00:15:25.684756 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:25.684764 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-kserve-provision-location\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 25 00:15:25.684982 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:25.684777 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hs7tk\" (UniqueName: \"kubernetes.io/projected/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216-kube-api-access-hs7tk\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\""
Apr 25 00:15:26.094998 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.094962 2576 generic.go:358] "Generic (PLEG): container finished" podID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerID="a42a8394a95449aecfe7f679356918c6a070329acf1ebb6e648bf5a08a108d61" exitCode=0
Apr 25 00:15:26.095196 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.095059 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw"
Apr 25 00:15:26.095196 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.095059 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" event={"ID":"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216","Type":"ContainerDied","Data":"a42a8394a95449aecfe7f679356918c6a070329acf1ebb6e648bf5a08a108d61"}
Apr 25 00:15:26.095196 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.095177 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw" event={"ID":"cbff0fd0-0ea3-4820-b252-cc6b5cfd4216","Type":"ContainerDied","Data":"c0ec22e5171b9aa4112a6aa9ae79a747da22a047dca5d760f56a25f791604485"}
Apr 25 00:15:26.095348 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.095213 2576 scope.go:117] "RemoveContainer" containerID="6ababe015db86786a583c7910567a6d0a4692e8dfc68e5e46ebd45dd481939ba"
Apr 25 00:15:26.096606 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.096585 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8448e3-predictor-84676484d9-mvhgn_f0339805-0726-40e0-829f-34ba5030726d/storage-initializer/0.log"
Apr 25 00:15:26.096899 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.096627 2576 generic.go:358] "Generic (PLEG): container finished" podID="f0339805-0726-40e0-829f-34ba5030726d" containerID="2a22427d863eeeed9a172b5d1ba5808b962dacb87b4dd728e892e39811bcb288" exitCode=1
Apr 25 00:15:26.096899 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.096732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn" event={"ID":"f0339805-0726-40e0-829f-34ba5030726d","Type":"ContainerDied","Data":"2a22427d863eeeed9a172b5d1ba5808b962dacb87b4dd728e892e39811bcb288"}
Apr 25 00:15:26.105168 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.105146 2576 scope.go:117] "RemoveContainer" containerID="a42a8394a95449aecfe7f679356918c6a070329acf1ebb6e648bf5a08a108d61"
Apr 25 00:15:26.113184 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.113157 2576 scope.go:117] "RemoveContainer" containerID="d8a9139661ec057985b1dbbbbae2d19625a7845237a4ddf2c816cdc1775da282"
Apr 25 00:15:26.121226 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.121207 2576 scope.go:117] "RemoveContainer" containerID="6ababe015db86786a583c7910567a6d0a4692e8dfc68e5e46ebd45dd481939ba"
Apr 25 00:15:26.121501 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:15:26.121484 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ababe015db86786a583c7910567a6d0a4692e8dfc68e5e46ebd45dd481939ba\": container with ID starting with 6ababe015db86786a583c7910567a6d0a4692e8dfc68e5e46ebd45dd481939ba not found: ID does not exist" containerID="6ababe015db86786a583c7910567a6d0a4692e8dfc68e5e46ebd45dd481939ba"
Apr 25 00:15:26.121563 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.121510 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ababe015db86786a583c7910567a6d0a4692e8dfc68e5e46ebd45dd481939ba"} err="failed to get container status \"6ababe015db86786a583c7910567a6d0a4692e8dfc68e5e46ebd45dd481939ba\": rpc error: code = NotFound desc = could not find container \"6ababe015db86786a583c7910567a6d0a4692e8dfc68e5e46ebd45dd481939ba\": container with ID starting with 6ababe015db86786a583c7910567a6d0a4692e8dfc68e5e46ebd45dd481939ba not found: ID does not exist"
Apr 25 00:15:26.121563 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.121529 2576 scope.go:117] "RemoveContainer" containerID="a42a8394a95449aecfe7f679356918c6a070329acf1ebb6e648bf5a08a108d61"
Apr 25 00:15:26.121802 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:15:26.121786 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a42a8394a95449aecfe7f679356918c6a070329acf1ebb6e648bf5a08a108d61\": container with ID starting with a42a8394a95449aecfe7f679356918c6a070329acf1ebb6e648bf5a08a108d61 not found: ID does not exist" containerID="a42a8394a95449aecfe7f679356918c6a070329acf1ebb6e648bf5a08a108d61"
Apr 25 00:15:26.121851 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.121806 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a42a8394a95449aecfe7f679356918c6a070329acf1ebb6e648bf5a08a108d61"} err="failed to get container status \"a42a8394a95449aecfe7f679356918c6a070329acf1ebb6e648bf5a08a108d61\": rpc error: code = NotFound desc = could not find container \"a42a8394a95449aecfe7f679356918c6a070329acf1ebb6e648bf5a08a108d61\": container with ID starting with a42a8394a95449aecfe7f679356918c6a070329acf1ebb6e648bf5a08a108d61 not found: ID does not exist"
Apr 25 00:15:26.121851 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.121820 2576 scope.go:117] "RemoveContainer" containerID="d8a9139661ec057985b1dbbbbae2d19625a7845237a4ddf2c816cdc1775da282"
Apr 25 00:15:26.122067 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:15:26.122049 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8a9139661ec057985b1dbbbbae2d19625a7845237a4ddf2c816cdc1775da282\": container with ID starting with d8a9139661ec057985b1dbbbbae2d19625a7845237a4ddf2c816cdc1775da282 not found: ID does not exist" containerID="d8a9139661ec057985b1dbbbbae2d19625a7845237a4ddf2c816cdc1775da282"
Apr 25 00:15:26.122111 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.122074 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8a9139661ec057985b1dbbbbae2d19625a7845237a4ddf2c816cdc1775da282"} err="failed to get container status \"d8a9139661ec057985b1dbbbbae2d19625a7845237a4ddf2c816cdc1775da282\": rpc error: code = NotFound desc = could not find container \"d8a9139661ec057985b1dbbbbae2d19625a7845237a4ddf2c816cdc1775da282\": container with ID starting with d8a9139661ec057985b1dbbbbae2d19625a7845237a4ddf2c816cdc1775da282 not found: ID does not exist"
Apr 25 00:15:26.127038 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.126947 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw"]
Apr 25 00:15:26.130596 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:26.130574 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-bbf281-predictor-6b7cf7ddb4-9mhhw"]
Apr 25 00:15:27.102828 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:27.102800 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8448e3-predictor-84676484d9-mvhgn_f0339805-0726-40e0-829f-34ba5030726d/storage-initializer/0.log"
Apr 25 00:15:27.103198 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:27.102874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn" event={"ID":"f0339805-0726-40e0-829f-34ba5030726d","Type":"ContainerStarted","Data":"7acf6d49e396a91a2aaf674e0b7d517236834d88368ef815519fb8ec9eeb547a"}
Apr 25 00:15:27.407303 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:27.407220 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" path="/var/lib/kubelet/pods/cbff0fd0-0ea3-4820-b252-cc6b5cfd4216/volumes"
Apr 25 00:15:30.114544 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:30.114514 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8448e3-predictor-84676484d9-mvhgn_f0339805-0726-40e0-829f-34ba5030726d/storage-initializer/1.log"
Apr 25 00:15:30.114964 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:30.114879 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8448e3-predictor-84676484d9-mvhgn_f0339805-0726-40e0-829f-34ba5030726d/storage-initializer/0.log"
Apr 25 00:15:30.114964 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:30.114914 2576 generic.go:358] "Generic (PLEG): container finished" podID="f0339805-0726-40e0-829f-34ba5030726d" containerID="7acf6d49e396a91a2aaf674e0b7d517236834d88368ef815519fb8ec9eeb547a" exitCode=1
Apr 25 00:15:30.114964 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:30.114952 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn" event={"ID":"f0339805-0726-40e0-829f-34ba5030726d","Type":"ContainerDied","Data":"7acf6d49e396a91a2aaf674e0b7d517236834d88368ef815519fb8ec9eeb547a"}
Apr 25 00:15:30.115069 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:30.114984 2576 scope.go:117] "RemoveContainer" containerID="2a22427d863eeeed9a172b5d1ba5808b962dacb87b4dd728e892e39811bcb288"
Apr 25 00:15:30.115396 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:30.115364 2576 scope.go:117] "RemoveContainer" containerID="2a22427d863eeeed9a172b5d1ba5808b962dacb87b4dd728e892e39811bcb288"
Apr 25 00:15:30.126483 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:15:30.126454 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-8448e3-predictor-84676484d9-mvhgn_kserve-ci-e2e-test_f0339805-0726-40e0-829f-34ba5030726d_0 in pod sandbox cf80a30631a7770d3aa6e0faec44736a5e4bc595fcc8ca02ea6a0b4a1bbcfa72 from index: no such id: '2a22427d863eeeed9a172b5d1ba5808b962dacb87b4dd728e892e39811bcb288'" containerID="2a22427d863eeeed9a172b5d1ba5808b962dacb87b4dd728e892e39811bcb288"
Apr 25 00:15:30.126550 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:15:30.126506 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-8448e3-predictor-84676484d9-mvhgn_kserve-ci-e2e-test_f0339805-0726-40e0-829f-34ba5030726d_0 in pod sandbox cf80a30631a7770d3aa6e0faec44736a5e4bc595fcc8ca02ea6a0b4a1bbcfa72 from index: no such id: '2a22427d863eeeed9a172b5d1ba5808b962dacb87b4dd728e892e39811bcb288'; Skipping pod \"isvc-init-fail-8448e3-predictor-84676484d9-mvhgn_kserve-ci-e2e-test(f0339805-0726-40e0-829f-34ba5030726d)\"" logger="UnhandledError"
Apr 25 00:15:30.127896 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:15:30.127866 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-8448e3-predictor-84676484d9-mvhgn_kserve-ci-e2e-test(f0339805-0726-40e0-829f-34ba5030726d)\"" pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn" podUID="f0339805-0726-40e0-829f-34ba5030726d"
Apr 25 00:15:31.119866 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.119832 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8448e3-predictor-84676484d9-mvhgn_f0339805-0726-40e0-829f-34ba5030726d/storage-initializer/1.log"
Apr 25 00:15:31.263193 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.263163 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"]
Apr 25 00:15:31.402822 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.402796 2576 log.go:25] "Finished parsing log file"
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8448e3-predictor-84676484d9-mvhgn_f0339805-0726-40e0-829f-34ba5030726d/storage-initializer/1.log" Apr 25 00:15:31.402971 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.402859 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn" Apr 25 00:15:31.410550 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.410518 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj"] Apr 25 00:15:31.410940 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.410924 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kserve-container" Apr 25 00:15:31.410985 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.410944 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kserve-container" Apr 25 00:15:31.410985 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.410967 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15cd8bda-e985-47ae-97fd-c320a2448ac4" containerName="storage-initializer" Apr 25 00:15:31.410985 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.410972 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cd8bda-e985-47ae-97fd-c320a2448ac4" containerName="storage-initializer" Apr 25 00:15:31.410985 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.410980 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kube-rbac-proxy" Apr 25 00:15:31.410985 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.410985 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kube-rbac-proxy" Apr 25 00:15:31.411134 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:15:31.410999 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0339805-0726-40e0-829f-34ba5030726d" containerName="storage-initializer" Apr 25 00:15:31.411134 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.411004 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0339805-0726-40e0-829f-34ba5030726d" containerName="storage-initializer" Apr 25 00:15:31.411134 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.411014 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15cd8bda-e985-47ae-97fd-c320a2448ac4" containerName="storage-initializer" Apr 25 00:15:31.411134 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.411022 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cd8bda-e985-47ae-97fd-c320a2448ac4" containerName="storage-initializer" Apr 25 00:15:31.411134 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.411029 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="storage-initializer" Apr 25 00:15:31.411134 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.411035 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="storage-initializer" Apr 25 00:15:31.411134 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.411040 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0339805-0726-40e0-829f-34ba5030726d" containerName="storage-initializer" Apr 25 00:15:31.411134 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.411045 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0339805-0726-40e0-829f-34ba5030726d" containerName="storage-initializer" Apr 25 00:15:31.411134 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.411104 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0339805-0726-40e0-829f-34ba5030726d" 
containerName="storage-initializer" Apr 25 00:15:31.411134 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.411114 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kserve-container" Apr 25 00:15:31.411134 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.411124 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="15cd8bda-e985-47ae-97fd-c320a2448ac4" containerName="storage-initializer" Apr 25 00:15:31.411134 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.411132 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="15cd8bda-e985-47ae-97fd-c320a2448ac4" containerName="storage-initializer" Apr 25 00:15:31.411134 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.411139 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbff0fd0-0ea3-4820-b252-cc6b5cfd4216" containerName="kube-rbac-proxy" Apr 25 00:15:31.411500 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.411260 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0339805-0726-40e0-829f-34ba5030726d" containerName="storage-initializer" Apr 25 00:15:31.416052 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.416031 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:31.418067 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.418043 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-926cf-predictor-serving-cert\"" Apr 25 00:15:31.418067 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.418057 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-926cf-kube-rbac-proxy-sar-config\"" Apr 25 00:15:31.418244 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.418066 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gz2jj\"" Apr 25 00:15:31.421678 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.421657 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj"] Apr 25 00:15:31.537015 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.536988 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f0339805-0726-40e0-829f-34ba5030726d-cabundle-cert\") pod \"f0339805-0726-40e0-829f-34ba5030726d\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " Apr 25 00:15:31.537175 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.537026 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-8448e3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0339805-0726-40e0-829f-34ba5030726d-isvc-init-fail-8448e3-kube-rbac-proxy-sar-config\") pod \"f0339805-0726-40e0-829f-34ba5030726d\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " Apr 25 00:15:31.537175 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.537061 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/f0339805-0726-40e0-829f-34ba5030726d-kserve-provision-location\") pod \"f0339805-0726-40e0-829f-34ba5030726d\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " Apr 25 00:15:31.537175 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.537084 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phrj5\" (UniqueName: \"kubernetes.io/projected/f0339805-0726-40e0-829f-34ba5030726d-kube-api-access-phrj5\") pod \"f0339805-0726-40e0-829f-34ba5030726d\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " Apr 25 00:15:31.537175 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.537136 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0339805-0726-40e0-829f-34ba5030726d-proxy-tls\") pod \"f0339805-0726-40e0-829f-34ba5030726d\" (UID: \"f0339805-0726-40e0-829f-34ba5030726d\") " Apr 25 00:15:31.537448 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.537357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stwg2\" (UniqueName: \"kubernetes.io/projected/69a9a267-5518-4d44-9461-fe6d2793a548-kube-api-access-stwg2\") pod \"raw-sklearn-926cf-predictor-6fc8899c49-49qgj\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:31.537448 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.537441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69a9a267-5518-4d44-9461-fe6d2793a548-proxy-tls\") pod \"raw-sklearn-926cf-predictor-6fc8899c49-49qgj\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:31.537552 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.537361 2576 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0339805-0726-40e0-829f-34ba5030726d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f0339805-0726-40e0-829f-34ba5030726d" (UID: "f0339805-0726-40e0-829f-34ba5030726d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:15:31.537552 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.537468 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0339805-0726-40e0-829f-34ba5030726d-isvc-init-fail-8448e3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-8448e3-kube-rbac-proxy-sar-config") pod "f0339805-0726-40e0-829f-34ba5030726d" (UID: "f0339805-0726-40e0-829f-34ba5030726d"). InnerVolumeSpecName "isvc-init-fail-8448e3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:15:31.537552 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.537443 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0339805-0726-40e0-829f-34ba5030726d-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "f0339805-0726-40e0-829f-34ba5030726d" (UID: "f0339805-0726-40e0-829f-34ba5030726d"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:15:31.537677 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.537571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-926cf-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69a9a267-5518-4d44-9461-fe6d2793a548-raw-sklearn-926cf-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-926cf-predictor-6fc8899c49-49qgj\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:31.537677 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.537593 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69a9a267-5518-4d44-9461-fe6d2793a548-kserve-provision-location\") pod \"raw-sklearn-926cf-predictor-6fc8899c49-49qgj\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:31.537677 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.537647 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f0339805-0726-40e0-829f-34ba5030726d-cabundle-cert\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:15:31.537841 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.537676 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-8448e3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f0339805-0726-40e0-829f-34ba5030726d-isvc-init-fail-8448e3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:15:31.537841 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.537729 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/f0339805-0726-40e0-829f-34ba5030726d-kserve-provision-location\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:15:31.539402 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.539378 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0339805-0726-40e0-829f-34ba5030726d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f0339805-0726-40e0-829f-34ba5030726d" (UID: "f0339805-0726-40e0-829f-34ba5030726d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:15:31.539488 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.539440 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0339805-0726-40e0-829f-34ba5030726d-kube-api-access-phrj5" (OuterVolumeSpecName: "kube-api-access-phrj5") pod "f0339805-0726-40e0-829f-34ba5030726d" (UID: "f0339805-0726-40e0-829f-34ba5030726d"). InnerVolumeSpecName "kube-api-access-phrj5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:15:31.638682 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.638557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stwg2\" (UniqueName: \"kubernetes.io/projected/69a9a267-5518-4d44-9461-fe6d2793a548-kube-api-access-stwg2\") pod \"raw-sklearn-926cf-predictor-6fc8899c49-49qgj\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:31.638682 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.638635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69a9a267-5518-4d44-9461-fe6d2793a548-proxy-tls\") pod \"raw-sklearn-926cf-predictor-6fc8899c49-49qgj\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:31.638682 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.638676 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-926cf-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69a9a267-5518-4d44-9461-fe6d2793a548-raw-sklearn-926cf-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-926cf-predictor-6fc8899c49-49qgj\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:31.638682 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.638721 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69a9a267-5518-4d44-9461-fe6d2793a548-kserve-provision-location\") pod \"raw-sklearn-926cf-predictor-6fc8899c49-49qgj\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:31.639157 ip-10-0-132-64 kubenswrapper[2576]: I0425 
00:15:31.638774 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0339805-0726-40e0-829f-34ba5030726d-proxy-tls\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:15:31.639157 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.638787 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-phrj5\" (UniqueName: \"kubernetes.io/projected/f0339805-0726-40e0-829f-34ba5030726d-kube-api-access-phrj5\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:15:31.639236 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.639167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69a9a267-5518-4d44-9461-fe6d2793a548-kserve-provision-location\") pod \"raw-sklearn-926cf-predictor-6fc8899c49-49qgj\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:31.639384 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.639363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-926cf-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69a9a267-5518-4d44-9461-fe6d2793a548-raw-sklearn-926cf-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-926cf-predictor-6fc8899c49-49qgj\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:31.641357 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.641336 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69a9a267-5518-4d44-9461-fe6d2793a548-proxy-tls\") pod \"raw-sklearn-926cf-predictor-6fc8899c49-49qgj\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:31.646363 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:15:31.646343 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stwg2\" (UniqueName: \"kubernetes.io/projected/69a9a267-5518-4d44-9461-fe6d2793a548-kube-api-access-stwg2\") pod \"raw-sklearn-926cf-predictor-6fc8899c49-49qgj\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:31.728559 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.728506 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:31.861097 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:31.861047 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj"] Apr 25 00:15:31.863585 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:15:31.863555 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a9a267_5518_4d44_9461_fe6d2793a548.slice/crio-9c7354451ffa10193e89cc556d2a11ab12a8f66b1874c145dc54870fbd329523 WatchSource:0}: Error finding container 9c7354451ffa10193e89cc556d2a11ab12a8f66b1874c145dc54870fbd329523: Status 404 returned error can't find the container with id 9c7354451ffa10193e89cc556d2a11ab12a8f66b1874c145dc54870fbd329523 Apr 25 00:15:32.124979 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:32.124950 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8448e3-predictor-84676484d9-mvhgn_f0339805-0726-40e0-829f-34ba5030726d/storage-initializer/1.log" Apr 25 00:15:32.125432 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:32.125041 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn" 
event={"ID":"f0339805-0726-40e0-829f-34ba5030726d","Type":"ContainerDied","Data":"cf80a30631a7770d3aa6e0faec44736a5e4bc595fcc8ca02ea6a0b4a1bbcfa72"} Apr 25 00:15:32.125432 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:32.125072 2576 scope.go:117] "RemoveContainer" containerID="7acf6d49e396a91a2aaf674e0b7d517236834d88368ef815519fb8ec9eeb547a" Apr 25 00:15:32.125432 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:32.125070 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn" Apr 25 00:15:32.126807 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:32.126776 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" event={"ID":"69a9a267-5518-4d44-9461-fe6d2793a548","Type":"ContainerStarted","Data":"a19044517dba14f485c0ba29265887ab0cbf2a11834f0b355ce8f07ff9263107"} Apr 25 00:15:32.126908 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:32.126810 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" event={"ID":"69a9a267-5518-4d44-9461-fe6d2793a548","Type":"ContainerStarted","Data":"9c7354451ffa10193e89cc556d2a11ab12a8f66b1874c145dc54870fbd329523"} Apr 25 00:15:32.176488 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:32.176397 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"] Apr 25 00:15:32.179434 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:32.179404 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8448e3-predictor-84676484d9-mvhgn"] Apr 25 00:15:33.400616 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:33.400575 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0339805-0726-40e0-829f-34ba5030726d" path="/var/lib/kubelet/pods/f0339805-0726-40e0-829f-34ba5030726d/volumes" Apr 25 
00:15:36.144198 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:36.144162 2576 generic.go:358] "Generic (PLEG): container finished" podID="69a9a267-5518-4d44-9461-fe6d2793a548" containerID="a19044517dba14f485c0ba29265887ab0cbf2a11834f0b355ce8f07ff9263107" exitCode=0 Apr 25 00:15:36.144586 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:36.144225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" event={"ID":"69a9a267-5518-4d44-9461-fe6d2793a548","Type":"ContainerDied","Data":"a19044517dba14f485c0ba29265887ab0cbf2a11834f0b355ce8f07ff9263107"} Apr 25 00:15:37.149539 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:37.149504 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" event={"ID":"69a9a267-5518-4d44-9461-fe6d2793a548","Type":"ContainerStarted","Data":"b7b04a3360576aabc8c6004f1fb748ce70a29a41c6b13ad7a4d637b47c45ebe2"} Apr 25 00:15:37.149539 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:37.149544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" event={"ID":"69a9a267-5518-4d44-9461-fe6d2793a548","Type":"ContainerStarted","Data":"daf5ba359de6b3539a946eb67bbcf50317b5f0e657ec556933a016a579c61872"} Apr 25 00:15:37.150060 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:37.149835 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:37.150060 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:37.149976 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:37.151241 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:37.151215 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:15:37.167974 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:37.167934 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" podStartSLOduration=6.167925163 podStartE2EDuration="6.167925163s" podCreationTimestamp="2026-04-25 00:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:15:37.166168857 +0000 UTC m=+1298.327118510" watchObservedRunningTime="2026-04-25 00:15:37.167925163 +0000 UTC m=+1298.328874805" Apr 25 00:15:38.153635 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:38.153595 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:15:43.157985 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:43.157951 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:15:43.158554 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:43.158528 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:15:53.158646 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:15:53.158602 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:16:03.158565 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:03.158524 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:16:13.159149 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:13.159045 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:16:23.158504 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:23.158457 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:16:33.158750 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:33.158675 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:16:43.159501 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:43.159471 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:16:51.542128 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:16:51.542089 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj"] Apr 25 00:16:51.542771 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.542681 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kserve-container" containerID="cri-o://daf5ba359de6b3539a946eb67bbcf50317b5f0e657ec556933a016a579c61872" gracePeriod=30 Apr 25 00:16:51.542966 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.542763 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kube-rbac-proxy" containerID="cri-o://b7b04a3360576aabc8c6004f1fb748ce70a29a41c6b13ad7a4d637b47c45ebe2" gracePeriod=30 Apr 25 00:16:51.620825 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.620794 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d"] Apr 25 00:16:51.624483 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.624465 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:51.627474 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.627310 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-bca52-kube-rbac-proxy-sar-config\"" Apr 25 00:16:51.627474 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.627340 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-bca52-predictor-serving-cert\"" Apr 25 00:16:51.634354 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.634320 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d"] Apr 25 00:16:51.725619 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.725576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkvbv\" (UniqueName: \"kubernetes.io/projected/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-kube-api-access-lkvbv\") pod \"raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:51.725851 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.725649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-proxy-tls\") pod \"raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:51.725851 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.725724 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"raw-sklearn-runtime-bca52-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-raw-sklearn-runtime-bca52-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:51.725851 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.725812 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-kserve-provision-location\") pod \"raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:51.827290 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.827174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-proxy-tls\") pod \"raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:51.827290 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.827246 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-bca52-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-raw-sklearn-runtime-bca52-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:51.827537 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.827339 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-kserve-provision-location\") pod \"raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:51.827537 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.827375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkvbv\" (UniqueName: \"kubernetes.io/projected/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-kube-api-access-lkvbv\") pod \"raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:51.827840 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.827813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-kserve-provision-location\") pod \"raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:51.828065 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.828048 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-bca52-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-raw-sklearn-runtime-bca52-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:51.829723 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.829686 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-proxy-tls\") pod \"raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:51.835413 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.835387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkvbv\" (UniqueName: \"kubernetes.io/projected/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-kube-api-access-lkvbv\") pod \"raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:51.936303 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:51.936268 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:52.061168 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:52.061136 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d"] Apr 25 00:16:52.064041 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:16:52.064009 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce258e6b_bad3_41a8_81c4_90a9a3d84b5b.slice/crio-18142250634f0e31c0c540a28ed8de8f0afbf9736d13676e9fdc4a7767fab41f WatchSource:0}: Error finding container 18142250634f0e31c0c540a28ed8de8f0afbf9736d13676e9fdc4a7767fab41f: Status 404 returned error can't find the container with id 18142250634f0e31c0c540a28ed8de8f0afbf9736d13676e9fdc4a7767fab41f Apr 25 00:16:52.065772 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:52.065751 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:16:52.421257 
ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:52.421157 2576 generic.go:358] "Generic (PLEG): container finished" podID="69a9a267-5518-4d44-9461-fe6d2793a548" containerID="b7b04a3360576aabc8c6004f1fb748ce70a29a41c6b13ad7a4d637b47c45ebe2" exitCode=2 Apr 25 00:16:52.421257 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:52.421235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" event={"ID":"69a9a267-5518-4d44-9461-fe6d2793a548","Type":"ContainerDied","Data":"b7b04a3360576aabc8c6004f1fb748ce70a29a41c6b13ad7a4d637b47c45ebe2"} Apr 25 00:16:52.422773 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:52.422749 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" event={"ID":"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b","Type":"ContainerStarted","Data":"4790534cdb4dd433b717c1c9a0da2bd47e083705fa7a085a278b407573fe1523"} Apr 25 00:16:52.422888 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:52.422777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" event={"ID":"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b","Type":"ContainerStarted","Data":"18142250634f0e31c0c540a28ed8de8f0afbf9736d13676e9fdc4a7767fab41f"} Apr 25 00:16:53.154371 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:53.154334 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.47:8643/healthz\": dial tcp 10.134.0.47:8643: connect: connection refused" Apr 25 00:16:53.158666 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:53.158643 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" 
podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:16:55.898029 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:55.898000 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:16:55.967271 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:55.967201 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stwg2\" (UniqueName: \"kubernetes.io/projected/69a9a267-5518-4d44-9461-fe6d2793a548-kube-api-access-stwg2\") pod \"69a9a267-5518-4d44-9461-fe6d2793a548\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " Apr 25 00:16:55.967271 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:55.967238 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69a9a267-5518-4d44-9461-fe6d2793a548-kserve-provision-location\") pod \"69a9a267-5518-4d44-9461-fe6d2793a548\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " Apr 25 00:16:55.967465 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:55.967294 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-926cf-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69a9a267-5518-4d44-9461-fe6d2793a548-raw-sklearn-926cf-kube-rbac-proxy-sar-config\") pod \"69a9a267-5518-4d44-9461-fe6d2793a548\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " Apr 25 00:16:55.967465 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:55.967329 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69a9a267-5518-4d44-9461-fe6d2793a548-proxy-tls\") pod \"69a9a267-5518-4d44-9461-fe6d2793a548\" (UID: \"69a9a267-5518-4d44-9461-fe6d2793a548\") " Apr 25 00:16:55.967666 
ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:55.967636 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a9a267-5518-4d44-9461-fe6d2793a548-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "69a9a267-5518-4d44-9461-fe6d2793a548" (UID: "69a9a267-5518-4d44-9461-fe6d2793a548"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:16:55.967666 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:55.967653 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a9a267-5518-4d44-9461-fe6d2793a548-raw-sklearn-926cf-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-926cf-kube-rbac-proxy-sar-config") pod "69a9a267-5518-4d44-9461-fe6d2793a548" (UID: "69a9a267-5518-4d44-9461-fe6d2793a548"). InnerVolumeSpecName "raw-sklearn-926cf-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:16:55.969353 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:55.969330 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a9a267-5518-4d44-9461-fe6d2793a548-kube-api-access-stwg2" (OuterVolumeSpecName: "kube-api-access-stwg2") pod "69a9a267-5518-4d44-9461-fe6d2793a548" (UID: "69a9a267-5518-4d44-9461-fe6d2793a548"). InnerVolumeSpecName "kube-api-access-stwg2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:16:55.969439 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:55.969382 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a9a267-5518-4d44-9461-fe6d2793a548-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "69a9a267-5518-4d44-9461-fe6d2793a548" (UID: "69a9a267-5518-4d44-9461-fe6d2793a548"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:16:56.068919 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.068862 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-stwg2\" (UniqueName: \"kubernetes.io/projected/69a9a267-5518-4d44-9461-fe6d2793a548-kube-api-access-stwg2\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:16:56.068919 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.068908 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69a9a267-5518-4d44-9461-fe6d2793a548-kserve-provision-location\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:16:56.068919 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.068921 2576 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-926cf-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69a9a267-5518-4d44-9461-fe6d2793a548-raw-sklearn-926cf-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:16:56.068919 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.068933 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69a9a267-5518-4d44-9461-fe6d2793a548-proxy-tls\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:16:56.438233 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.438202 2576 generic.go:358] "Generic (PLEG): container finished" podID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerID="4790534cdb4dd433b717c1c9a0da2bd47e083705fa7a085a278b407573fe1523" exitCode=0 Apr 25 00:16:56.438412 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.438286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" 
event={"ID":"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b","Type":"ContainerDied","Data":"4790534cdb4dd433b717c1c9a0da2bd47e083705fa7a085a278b407573fe1523"} Apr 25 00:16:56.439940 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.439918 2576 generic.go:358] "Generic (PLEG): container finished" podID="69a9a267-5518-4d44-9461-fe6d2793a548" containerID="daf5ba359de6b3539a946eb67bbcf50317b5f0e657ec556933a016a579c61872" exitCode=0 Apr 25 00:16:56.440058 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.439955 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" event={"ID":"69a9a267-5518-4d44-9461-fe6d2793a548","Type":"ContainerDied","Data":"daf5ba359de6b3539a946eb67bbcf50317b5f0e657ec556933a016a579c61872"} Apr 25 00:16:56.440058 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.439982 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" Apr 25 00:16:56.440058 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.439988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj" event={"ID":"69a9a267-5518-4d44-9461-fe6d2793a548","Type":"ContainerDied","Data":"9c7354451ffa10193e89cc556d2a11ab12a8f66b1874c145dc54870fbd329523"} Apr 25 00:16:56.440058 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.440009 2576 scope.go:117] "RemoveContainer" containerID="b7b04a3360576aabc8c6004f1fb748ce70a29a41c6b13ad7a4d637b47c45ebe2" Apr 25 00:16:56.453035 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.452992 2576 scope.go:117] "RemoveContainer" containerID="daf5ba359de6b3539a946eb67bbcf50317b5f0e657ec556933a016a579c61872" Apr 25 00:16:56.462469 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.462433 2576 scope.go:117] "RemoveContainer" containerID="a19044517dba14f485c0ba29265887ab0cbf2a11834f0b355ce8f07ff9263107" Apr 25 00:16:56.469979 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:16:56.469951 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj"] Apr 25 00:16:56.472249 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.472153 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-926cf-predictor-6fc8899c49-49qgj"] Apr 25 00:16:56.472314 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.472278 2576 scope.go:117] "RemoveContainer" containerID="b7b04a3360576aabc8c6004f1fb748ce70a29a41c6b13ad7a4d637b47c45ebe2" Apr 25 00:16:56.472535 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:16:56.472515 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7b04a3360576aabc8c6004f1fb748ce70a29a41c6b13ad7a4d637b47c45ebe2\": container with ID starting with b7b04a3360576aabc8c6004f1fb748ce70a29a41c6b13ad7a4d637b47c45ebe2 not found: ID does not exist" containerID="b7b04a3360576aabc8c6004f1fb748ce70a29a41c6b13ad7a4d637b47c45ebe2" Apr 25 00:16:56.472596 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.472544 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b04a3360576aabc8c6004f1fb748ce70a29a41c6b13ad7a4d637b47c45ebe2"} err="failed to get container status \"b7b04a3360576aabc8c6004f1fb748ce70a29a41c6b13ad7a4d637b47c45ebe2\": rpc error: code = NotFound desc = could not find container \"b7b04a3360576aabc8c6004f1fb748ce70a29a41c6b13ad7a4d637b47c45ebe2\": container with ID starting with b7b04a3360576aabc8c6004f1fb748ce70a29a41c6b13ad7a4d637b47c45ebe2 not found: ID does not exist" Apr 25 00:16:56.472596 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.472562 2576 scope.go:117] "RemoveContainer" containerID="daf5ba359de6b3539a946eb67bbcf50317b5f0e657ec556933a016a579c61872" Apr 25 00:16:56.472838 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:16:56.472820 2576 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"daf5ba359de6b3539a946eb67bbcf50317b5f0e657ec556933a016a579c61872\": container with ID starting with daf5ba359de6b3539a946eb67bbcf50317b5f0e657ec556933a016a579c61872 not found: ID does not exist" containerID="daf5ba359de6b3539a946eb67bbcf50317b5f0e657ec556933a016a579c61872" Apr 25 00:16:56.472889 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.472846 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daf5ba359de6b3539a946eb67bbcf50317b5f0e657ec556933a016a579c61872"} err="failed to get container status \"daf5ba359de6b3539a946eb67bbcf50317b5f0e657ec556933a016a579c61872\": rpc error: code = NotFound desc = could not find container \"daf5ba359de6b3539a946eb67bbcf50317b5f0e657ec556933a016a579c61872\": container with ID starting with daf5ba359de6b3539a946eb67bbcf50317b5f0e657ec556933a016a579c61872 not found: ID does not exist" Apr 25 00:16:56.472889 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.472862 2576 scope.go:117] "RemoveContainer" containerID="a19044517dba14f485c0ba29265887ab0cbf2a11834f0b355ce8f07ff9263107" Apr 25 00:16:56.473101 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:16:56.473082 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a19044517dba14f485c0ba29265887ab0cbf2a11834f0b355ce8f07ff9263107\": container with ID starting with a19044517dba14f485c0ba29265887ab0cbf2a11834f0b355ce8f07ff9263107 not found: ID does not exist" containerID="a19044517dba14f485c0ba29265887ab0cbf2a11834f0b355ce8f07ff9263107" Apr 25 00:16:56.473142 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:56.473107 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a19044517dba14f485c0ba29265887ab0cbf2a11834f0b355ce8f07ff9263107"} err="failed to get container status \"a19044517dba14f485c0ba29265887ab0cbf2a11834f0b355ce8f07ff9263107\": rpc 
error: code = NotFound desc = could not find container \"a19044517dba14f485c0ba29265887ab0cbf2a11834f0b355ce8f07ff9263107\": container with ID starting with a19044517dba14f485c0ba29265887ab0cbf2a11834f0b355ce8f07ff9263107 not found: ID does not exist" Apr 25 00:16:57.400028 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:57.399992 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" path="/var/lib/kubelet/pods/69a9a267-5518-4d44-9461-fe6d2793a548/volumes" Apr 25 00:16:57.444352 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:57.444311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" event={"ID":"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b","Type":"ContainerStarted","Data":"19beac1ff4cae711b52e4c5853ad6a9567117c2784398d58ce64943b6496a040"} Apr 25 00:16:57.444520 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:57.444360 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" event={"ID":"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b","Type":"ContainerStarted","Data":"e525c6afa925498cc66d17c834a9cbc02114a6f326f86f961a9cb633afc21df2"} Apr 25 00:16:57.444716 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:57.444669 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:57.444803 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:57.444736 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:16:57.445866 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:57.445842 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 25 00:16:57.462149 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:57.462101 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" podStartSLOduration=6.46208695 podStartE2EDuration="6.46208695s" podCreationTimestamp="2026-04-25 00:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:16:57.460635828 +0000 UTC m=+1378.621585469" watchObservedRunningTime="2026-04-25 00:16:57.46208695 +0000 UTC m=+1378.623036591" Apr 25 00:16:58.449715 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:16:58.449663 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 25 00:17:03.454112 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:17:03.454083 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:17:03.454630 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:17:03.454604 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 25 00:17:13.455276 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:17:13.455229 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 25 00:17:23.455444 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:17:23.455402 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 25 00:17:33.454625 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:17:33.454582 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 25 00:17:43.455229 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:17:43.455186 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 25 00:17:53.455095 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:17:53.455004 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 25 00:18:03.456096 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:03.456067 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:18:11.725090 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:11.725057 2576 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d"] Apr 25 00:18:11.725600 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:11.725350 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kserve-container" containerID="cri-o://e525c6afa925498cc66d17c834a9cbc02114a6f326f86f961a9cb633afc21df2" gracePeriod=30 Apr 25 00:18:11.725600 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:11.725403 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kube-rbac-proxy" containerID="cri-o://19beac1ff4cae711b52e4c5853ad6a9567117c2784398d58ce64943b6496a040" gracePeriod=30 Apr 25 00:18:12.715339 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:12.715300 2576 generic.go:358] "Generic (PLEG): container finished" podID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerID="19beac1ff4cae711b52e4c5853ad6a9567117c2784398d58ce64943b6496a040" exitCode=2 Apr 25 00:18:12.715545 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:12.715376 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" event={"ID":"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b","Type":"ContainerDied","Data":"19beac1ff4cae711b52e4c5853ad6a9567117c2784398d58ce64943b6496a040"} Apr 25 00:18:13.450594 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:13.450551 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.48:8643/healthz\": dial tcp 10.134.0.48:8643: connect: connection refused" Apr 25 
00:18:13.454855 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:13.454823 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 25 00:18:15.873154 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:15.873132 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:18:16.038453 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.038415 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-proxy-tls\") pod \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " Apr 25 00:18:16.038631 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.038470 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-kserve-provision-location\") pod \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " Apr 25 00:18:16.038631 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.038523 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkvbv\" (UniqueName: \"kubernetes.io/projected/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-kube-api-access-lkvbv\") pod \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " Apr 25 00:18:16.038631 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.038558 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-bca52-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-raw-sklearn-runtime-bca52-kube-rbac-proxy-sar-config\") pod \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\" (UID: \"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b\") " Apr 25 00:18:16.038937 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.038902 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" (UID: "ce258e6b-bad3-41a8-81c4-90a9a3d84b5b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:18:16.039072 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.039030 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-raw-sklearn-runtime-bca52-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-bca52-kube-rbac-proxy-sar-config") pod "ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" (UID: "ce258e6b-bad3-41a8-81c4-90a9a3d84b5b"). InnerVolumeSpecName "raw-sklearn-runtime-bca52-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:18:16.040874 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.040854 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" (UID: "ce258e6b-bad3-41a8-81c4-90a9a3d84b5b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:18:16.040940 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.040868 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-kube-api-access-lkvbv" (OuterVolumeSpecName: "kube-api-access-lkvbv") pod "ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" (UID: "ce258e6b-bad3-41a8-81c4-90a9a3d84b5b"). InnerVolumeSpecName "kube-api-access-lkvbv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:18:16.140046 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.140019 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lkvbv\" (UniqueName: \"kubernetes.io/projected/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-kube-api-access-lkvbv\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:18:16.140046 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.140043 2576 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-bca52-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-raw-sklearn-runtime-bca52-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:18:16.140193 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.140054 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-proxy-tls\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:18:16.140193 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.140064 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b-kserve-provision-location\") on node \"ip-10-0-132-64.ec2.internal\" DevicePath \"\"" Apr 25 00:18:16.738013 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.737970 2576 generic.go:358] "Generic (PLEG): 
container finished" podID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerID="e525c6afa925498cc66d17c834a9cbc02114a6f326f86f961a9cb633afc21df2" exitCode=0 Apr 25 00:18:16.738229 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.738025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" event={"ID":"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b","Type":"ContainerDied","Data":"e525c6afa925498cc66d17c834a9cbc02114a6f326f86f961a9cb633afc21df2"} Apr 25 00:18:16.738229 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.738076 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" event={"ID":"ce258e6b-bad3-41a8-81c4-90a9a3d84b5b","Type":"ContainerDied","Data":"18142250634f0e31c0c540a28ed8de8f0afbf9736d13676e9fdc4a7767fab41f"} Apr 25 00:18:16.738229 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.738082 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d" Apr 25 00:18:16.738229 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.738098 2576 scope.go:117] "RemoveContainer" containerID="19beac1ff4cae711b52e4c5853ad6a9567117c2784398d58ce64943b6496a040" Apr 25 00:18:16.747255 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.747237 2576 scope.go:117] "RemoveContainer" containerID="e525c6afa925498cc66d17c834a9cbc02114a6f326f86f961a9cb633afc21df2" Apr 25 00:18:16.754667 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.754650 2576 scope.go:117] "RemoveContainer" containerID="4790534cdb4dd433b717c1c9a0da2bd47e083705fa7a085a278b407573fe1523" Apr 25 00:18:16.760201 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.760178 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d"] Apr 25 00:18:16.762926 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.762908 2576 scope.go:117] "RemoveContainer" containerID="19beac1ff4cae711b52e4c5853ad6a9567117c2784398d58ce64943b6496a040" Apr 25 00:18:16.763191 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:18:16.763172 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19beac1ff4cae711b52e4c5853ad6a9567117c2784398d58ce64943b6496a040\": container with ID starting with 19beac1ff4cae711b52e4c5853ad6a9567117c2784398d58ce64943b6496a040 not found: ID does not exist" containerID="19beac1ff4cae711b52e4c5853ad6a9567117c2784398d58ce64943b6496a040" Apr 25 00:18:16.763261 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.763204 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19beac1ff4cae711b52e4c5853ad6a9567117c2784398d58ce64943b6496a040"} err="failed to get container status \"19beac1ff4cae711b52e4c5853ad6a9567117c2784398d58ce64943b6496a040\": rpc error: code = NotFound desc = could not find 
container \"19beac1ff4cae711b52e4c5853ad6a9567117c2784398d58ce64943b6496a040\": container with ID starting with 19beac1ff4cae711b52e4c5853ad6a9567117c2784398d58ce64943b6496a040 not found: ID does not exist" Apr 25 00:18:16.763261 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.763226 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-bca52-predictor-7cc55b85d5-nl97d"] Apr 25 00:18:16.763261 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.763230 2576 scope.go:117] "RemoveContainer" containerID="e525c6afa925498cc66d17c834a9cbc02114a6f326f86f961a9cb633afc21df2" Apr 25 00:18:16.763487 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:18:16.763470 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e525c6afa925498cc66d17c834a9cbc02114a6f326f86f961a9cb633afc21df2\": container with ID starting with e525c6afa925498cc66d17c834a9cbc02114a6f326f86f961a9cb633afc21df2 not found: ID does not exist" containerID="e525c6afa925498cc66d17c834a9cbc02114a6f326f86f961a9cb633afc21df2" Apr 25 00:18:16.763551 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.763493 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e525c6afa925498cc66d17c834a9cbc02114a6f326f86f961a9cb633afc21df2"} err="failed to get container status \"e525c6afa925498cc66d17c834a9cbc02114a6f326f86f961a9cb633afc21df2\": rpc error: code = NotFound desc = could not find container \"e525c6afa925498cc66d17c834a9cbc02114a6f326f86f961a9cb633afc21df2\": container with ID starting with e525c6afa925498cc66d17c834a9cbc02114a6f326f86f961a9cb633afc21df2 not found: ID does not exist" Apr 25 00:18:16.763551 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.763512 2576 scope.go:117] "RemoveContainer" containerID="4790534cdb4dd433b717c1c9a0da2bd47e083705fa7a085a278b407573fe1523" Apr 25 00:18:16.763754 ip-10-0-132-64 kubenswrapper[2576]: E0425 00:18:16.763734 2576 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4790534cdb4dd433b717c1c9a0da2bd47e083705fa7a085a278b407573fe1523\": container with ID starting with 4790534cdb4dd433b717c1c9a0da2bd47e083705fa7a085a278b407573fe1523 not found: ID does not exist" containerID="4790534cdb4dd433b717c1c9a0da2bd47e083705fa7a085a278b407573fe1523" Apr 25 00:18:16.763799 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:16.763760 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4790534cdb4dd433b717c1c9a0da2bd47e083705fa7a085a278b407573fe1523"} err="failed to get container status \"4790534cdb4dd433b717c1c9a0da2bd47e083705fa7a085a278b407573fe1523\": rpc error: code = NotFound desc = could not find container \"4790534cdb4dd433b717c1c9a0da2bd47e083705fa7a085a278b407573fe1523\": container with ID starting with 4790534cdb4dd433b717c1c9a0da2bd47e083705fa7a085a278b407573fe1523 not found: ID does not exist" Apr 25 00:18:17.400591 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:17.400557 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" path="/var/lib/kubelet/pods/ce258e6b-bad3-41a8-81c4-90a9a3d84b5b/volumes" Apr 25 00:18:37.038480 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038424 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sv58q/must-gather-smqsc"] Apr 25 00:18:37.038989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038838 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kube-rbac-proxy" Apr 25 00:18:37.038989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038849 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kube-rbac-proxy" Apr 25 00:18:37.038989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038866 2576 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="storage-initializer" Apr 25 00:18:37.038989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038871 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="storage-initializer" Apr 25 00:18:37.038989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038879 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="storage-initializer" Apr 25 00:18:37.038989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038885 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="storage-initializer" Apr 25 00:18:37.038989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038891 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kserve-container" Apr 25 00:18:37.038989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038897 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kserve-container" Apr 25 00:18:37.038989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038906 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kserve-container" Apr 25 00:18:37.038989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038911 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kserve-container" Apr 25 00:18:37.038989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038922 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kube-rbac-proxy" Apr 25 00:18:37.038989 ip-10-0-132-64 
kubenswrapper[2576]: I0425 00:18:37.038927 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kube-rbac-proxy" Apr 25 00:18:37.038989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038980 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kube-rbac-proxy" Apr 25 00:18:37.038989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038990 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="69a9a267-5518-4d44-9461-fe6d2793a548" containerName="kserve-container" Apr 25 00:18:37.038989 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.038996 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kserve-container" Apr 25 00:18:37.039466 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.039004 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce258e6b-bad3-41a8-81c4-90a9a3d84b5b" containerName="kube-rbac-proxy" Apr 25 00:18:37.043552 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.043535 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sv58q/must-gather-smqsc" Apr 25 00:18:37.046065 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.046037 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sv58q\"/\"kube-root-ca.crt\"" Apr 25 00:18:37.046795 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.046769 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-sv58q\"/\"default-dockercfg-h5ljq\"" Apr 25 00:18:37.046910 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.046769 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sv58q\"/\"openshift-service-ca.crt\"" Apr 25 00:18:37.047087 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.047063 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sv58q/must-gather-smqsc"] Apr 25 00:18:37.121429 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.121404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3-must-gather-output\") pod \"must-gather-smqsc\" (UID: \"a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3\") " pod="openshift-must-gather-sv58q/must-gather-smqsc" Apr 25 00:18:37.121550 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.121443 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpsv8\" (UniqueName: \"kubernetes.io/projected/a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3-kube-api-access-mpsv8\") pod \"must-gather-smqsc\" (UID: \"a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3\") " pod="openshift-must-gather-sv58q/must-gather-smqsc" Apr 25 00:18:37.222023 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.221991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpsv8\" (UniqueName: 
\"kubernetes.io/projected/a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3-kube-api-access-mpsv8\") pod \"must-gather-smqsc\" (UID: \"a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3\") " pod="openshift-must-gather-sv58q/must-gather-smqsc" Apr 25 00:18:37.222171 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.222099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3-must-gather-output\") pod \"must-gather-smqsc\" (UID: \"a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3\") " pod="openshift-must-gather-sv58q/must-gather-smqsc" Apr 25 00:18:37.222393 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.222374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3-must-gather-output\") pod \"must-gather-smqsc\" (UID: \"a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3\") " pod="openshift-must-gather-sv58q/must-gather-smqsc" Apr 25 00:18:37.229946 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.229925 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpsv8\" (UniqueName: \"kubernetes.io/projected/a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3-kube-api-access-mpsv8\") pod \"must-gather-smqsc\" (UID: \"a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3\") " pod="openshift-must-gather-sv58q/must-gather-smqsc" Apr 25 00:18:37.368448 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.368369 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sv58q/must-gather-smqsc" Apr 25 00:18:37.496410 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.496384 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sv58q/must-gather-smqsc"] Apr 25 00:18:37.498147 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:18:37.498102 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda33dc1b8_1b5c_4c08_8451_3ed4bd3e23a3.slice/crio-090934f9745da6a76ddcb48824ae0dc8121b167b010ee4aa10527a9590c54c6e WatchSource:0}: Error finding container 090934f9745da6a76ddcb48824ae0dc8121b167b010ee4aa10527a9590c54c6e: Status 404 returned error can't find the container with id 090934f9745da6a76ddcb48824ae0dc8121b167b010ee4aa10527a9590c54c6e Apr 25 00:18:37.815363 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:37.815323 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sv58q/must-gather-smqsc" event={"ID":"a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3","Type":"ContainerStarted","Data":"090934f9745da6a76ddcb48824ae0dc8121b167b010ee4aa10527a9590c54c6e"} Apr 25 00:18:38.827480 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:38.827436 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sv58q/must-gather-smqsc" event={"ID":"a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3","Type":"ContainerStarted","Data":"99e2166f0d95c85f06c49428fc7f4c8262dc87bd3295b419cc2fc9d421e41d4a"} Apr 25 00:18:38.827480 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:38.827482 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sv58q/must-gather-smqsc" event={"ID":"a33dc1b8-1b5c-4c08-8451-3ed4bd3e23a3","Type":"ContainerStarted","Data":"0775b0004ecd815b089537647dc878056b7006121cc587befb10face2fbe9acf"} Apr 25 00:18:38.844684 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:38.844628 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-sv58q/must-gather-smqsc" podStartSLOduration=0.835302798 podStartE2EDuration="1.844614885s" podCreationTimestamp="2026-04-25 00:18:37 +0000 UTC" firstStartedPulling="2026-04-25 00:18:37.499902188 +0000 UTC m=+1478.660851807" lastFinishedPulling="2026-04-25 00:18:38.509214272 +0000 UTC m=+1479.670163894" observedRunningTime="2026-04-25 00:18:38.842113914 +0000 UTC m=+1480.003063555" watchObservedRunningTime="2026-04-25 00:18:38.844614885 +0000 UTC m=+1480.005564525" Apr 25 00:18:39.913738 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:39.913708 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-pxtxr_e751c0d7-8e62-4d09-bbcb-7987a6bb0be2/global-pull-secret-syncer/0.log" Apr 25 00:18:40.011507 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:40.011455 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-5xdns_607faeff-0f25-43eb-a633-127b915c9238/konnectivity-agent/0.log" Apr 25 00:18:40.089303 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:40.089277 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-64.ec2.internal_d7a770aaafd6135ccc34734060ee4e87/haproxy/0.log" Apr 25 00:18:43.043526 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.043490 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_787e6d5d-d8c4-411c-8182-e1c27fa743f8/alertmanager/0.log" Apr 25 00:18:43.074293 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.074255 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_787e6d5d-d8c4-411c-8182-e1c27fa743f8/config-reloader/0.log" Apr 25 00:18:43.104020 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.103980 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_787e6d5d-d8c4-411c-8182-e1c27fa743f8/kube-rbac-proxy-web/0.log" Apr 
25 00:18:43.128434 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.128394 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_787e6d5d-d8c4-411c-8182-e1c27fa743f8/kube-rbac-proxy/0.log" Apr 25 00:18:43.161173 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.161116 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_787e6d5d-d8c4-411c-8182-e1c27fa743f8/kube-rbac-proxy-metric/0.log" Apr 25 00:18:43.194366 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.194331 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_787e6d5d-d8c4-411c-8182-e1c27fa743f8/prom-label-proxy/0.log" Apr 25 00:18:43.216625 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.216590 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_787e6d5d-d8c4-411c-8182-e1c27fa743f8/init-config-reloader/0.log" Apr 25 00:18:43.420270 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.420128 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-6hzlt_504b17af-7b61-4156-bbc0-1a95c7919e51/monitoring-plugin/0.log" Apr 25 00:18:43.452783 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.452744 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2n5fc_230b41dd-71de-46ef-9417-3bcfa4d0c7ef/node-exporter/0.log" Apr 25 00:18:43.477723 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.477667 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2n5fc_230b41dd-71de-46ef-9417-3bcfa4d0c7ef/kube-rbac-proxy/0.log" Apr 25 00:18:43.507390 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.507353 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2n5fc_230b41dd-71de-46ef-9417-3bcfa4d0c7ef/init-textfile/0.log" Apr 25 
00:18:43.681675 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.681599 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8l75g_c2beaa31-74bb-4e8e-a7d9-b8b39dc02466/kube-rbac-proxy-main/0.log" Apr 25 00:18:43.702156 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.702120 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8l75g_c2beaa31-74bb-4e8e-a7d9-b8b39dc02466/kube-rbac-proxy-self/0.log" Apr 25 00:18:43.723736 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.723707 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8l75g_c2beaa31-74bb-4e8e-a7d9-b8b39dc02466/openshift-state-metrics/0.log" Apr 25 00:18:43.968741 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.968597 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-pcm5d_29f3e42e-6f53-4dd6-b2bd-8b3b379a2d0b/prometheus-operator-admission-webhook/0.log" Apr 25 00:18:43.994945 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:43.994904 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-667c8bcf6d-xjsqz_3f60b52c-16eb-4774-8e73-fb65b6922ee9/telemeter-client/0.log" Apr 25 00:18:44.019041 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:44.019014 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-667c8bcf6d-xjsqz_3f60b52c-16eb-4774-8e73-fb65b6922ee9/reload/0.log" Apr 25 00:18:44.041751 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:44.041712 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-667c8bcf6d-xjsqz_3f60b52c-16eb-4774-8e73-fb65b6922ee9/kube-rbac-proxy/0.log" Apr 25 00:18:44.068297 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:44.068269 2576 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-monitoring_thanos-querier-696f599485-86lrs_cb77e695-4f0e-4120-b4e4-479cb80be577/thanos-query/0.log" Apr 25 00:18:44.089193 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:44.089157 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-696f599485-86lrs_cb77e695-4f0e-4120-b4e4-479cb80be577/kube-rbac-proxy-web/0.log" Apr 25 00:18:44.108988 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:44.108960 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-696f599485-86lrs_cb77e695-4f0e-4120-b4e4-479cb80be577/kube-rbac-proxy/0.log" Apr 25 00:18:44.130498 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:44.130468 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-696f599485-86lrs_cb77e695-4f0e-4120-b4e4-479cb80be577/prom-label-proxy/0.log" Apr 25 00:18:44.150137 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:44.150077 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-696f599485-86lrs_cb77e695-4f0e-4120-b4e4-479cb80be577/kube-rbac-proxy-rules/0.log" Apr 25 00:18:44.176525 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:44.176487 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-696f599485-86lrs_cb77e695-4f0e-4120-b4e4-479cb80be577/kube-rbac-proxy-metrics/0.log" Apr 25 00:18:45.323810 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:45.323775 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-sb9nb_fe581fd0-91fe-46d8-be3f-cc2be31f574f/networking-console-plugin/0.log" Apr 25 00:18:45.830071 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:45.830038 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/1.log" 
Apr 25 00:18:45.839085 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:45.839047 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/2.log" Apr 25 00:18:46.232258 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:46.232002 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cb59d7578-sk9dg_ec5ce6b6-b2f8-4931-972c-963fb1274140/console/0.log" Apr 25 00:18:46.659949 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:46.659909 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-99fjd_85bb6fee-0df0-467b-85aa-d617cbda12e0/volume-data-source-validator/0.log" Apr 25 00:18:47.320493 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.320446 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j4hmb_c18a83d5-7d20-4b99-9a28-d4fea36360b1/dns/0.log" Apr 25 00:18:47.326933 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.326909 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"] Apr 25 00:18:47.337022 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.336990 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7" Apr 25 00:18:47.339953 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.339922 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"] Apr 25 00:18:47.357898 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.357871 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j4hmb_c18a83d5-7d20-4b99-9a28-d4fea36360b1/kube-rbac-proxy/0.log" Apr 25 00:18:47.418640 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.418604 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/925732b5-0014-432c-8bc1-419cba84c2fd-podres\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7" Apr 25 00:18:47.418852 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.418668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/925732b5-0014-432c-8bc1-419cba84c2fd-sys\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7" Apr 25 00:18:47.418852 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.418721 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/925732b5-0014-432c-8bc1-419cba84c2fd-lib-modules\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7" Apr 25 00:18:47.418852 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.418763 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/925732b5-0014-432c-8bc1-419cba84c2fd-proc\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:47.418852 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.418807 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8978j\" (UniqueName: \"kubernetes.io/projected/925732b5-0014-432c-8bc1-419cba84c2fd-kube-api-access-8978j\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:47.441778 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.441748 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gfw9h_84d61329-00aa-4270-b9d1-b1f736da6f64/dns-node-resolver/0.log"
Apr 25 00:18:47.519557 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.519502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/925732b5-0014-432c-8bc1-419cba84c2fd-lib-modules\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:47.519788 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.519585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/925732b5-0014-432c-8bc1-419cba84c2fd-proc\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:47.519788 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.519631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8978j\" (UniqueName: \"kubernetes.io/projected/925732b5-0014-432c-8bc1-419cba84c2fd-kube-api-access-8978j\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:47.519788 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.519670 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/925732b5-0014-432c-8bc1-419cba84c2fd-podres\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:47.519788 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.519765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/925732b5-0014-432c-8bc1-419cba84c2fd-sys\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:47.520018 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.519862 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/925732b5-0014-432c-8bc1-419cba84c2fd-sys\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:47.520315 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.520288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/925732b5-0014-432c-8bc1-419cba84c2fd-proc\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:47.520427 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.520361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/925732b5-0014-432c-8bc1-419cba84c2fd-podres\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:47.520427 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.520381 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/925732b5-0014-432c-8bc1-419cba84c2fd-lib-modules\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:47.528805 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.528782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8978j\" (UniqueName: \"kubernetes.io/projected/925732b5-0014-432c-8bc1-419cba84c2fd-kube-api-access-8978j\") pod \"perf-node-gather-daemonset-bvws7\" (UID: \"925732b5-0014-432c-8bc1-419cba84c2fd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:47.652354 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.652258 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:47.800120 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.800085 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"]
Apr 25 00:18:47.802966 ip-10-0-132-64 kubenswrapper[2576]: W0425 00:18:47.802913 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod925732b5_0014_432c_8bc1_419cba84c2fd.slice/crio-300bdab625d53fe1507e3379f49b6d27af98fbd6ba4c5ab3db176d4b9cc42157 WatchSource:0}: Error finding container 300bdab625d53fe1507e3379f49b6d27af98fbd6ba4c5ab3db176d4b9cc42157: Status 404 returned error can't find the container with id 300bdab625d53fe1507e3379f49b6d27af98fbd6ba4c5ab3db176d4b9cc42157
Apr 25 00:18:47.867454 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.867424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7" event={"ID":"925732b5-0014-432c-8bc1-419cba84c2fd","Type":"ContainerStarted","Data":"300bdab625d53fe1507e3379f49b6d27af98fbd6ba4c5ab3db176d4b9cc42157"}
Apr 25 00:18:47.937583 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:47.937536 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cm667_ba532e45-f2da-4349-bf2b-680421e6b958/node-ca/0.log"
Apr 25 00:18:48.872218 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:48.872182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7" event={"ID":"925732b5-0014-432c-8bc1-419cba84c2fd","Type":"ContainerStarted","Data":"e65192bb2d5bdb595b24fa3e0a5010b6d61c29d6d1a0c8d231cf5097ee94ed1f"}
Apr 25 00:18:48.872667 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:48.872235 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:48.887517 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:48.887462 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7" podStartSLOduration=1.887443239 podStartE2EDuration="1.887443239s" podCreationTimestamp="2026-04-25 00:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:18:48.885968144 +0000 UTC m=+1490.046917814" watchObservedRunningTime="2026-04-25 00:18:48.887443239 +0000 UTC m=+1490.048392880"
Apr 25 00:18:49.040080 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:49.040053 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gb2jv_14193e4c-7287-4686-892b-3006e6c02a97/serve-healthcheck-canary/0.log"
Apr 25 00:18:49.373294 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:49.373256 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-sfnrq_d5c69f72-f063-42af-a243-45f740a1ea73/insights-operator/0.log"
Apr 25 00:18:49.374125 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:49.374107 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-sfnrq_d5c69f72-f063-42af-a243-45f740a1ea73/insights-operator/1.log"
Apr 25 00:18:49.580664 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:49.580630 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-w5st9_086f85ea-a11f-451b-94e1-ff8da489f053/kube-rbac-proxy/0.log"
Apr 25 00:18:49.600848 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:49.600806 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-w5st9_086f85ea-a11f-451b-94e1-ff8da489f053/exporter/0.log"
Apr 25 00:18:49.621677 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:49.621655 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-w5st9_086f85ea-a11f-451b-94e1-ff8da489f053/extractor/0.log"
Apr 25 00:18:51.676771 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:51.676744 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-55x25_9026da9a-5b30-4323-9828-0c7956b74eff/server/0.log"
Apr 25 00:18:51.773292 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:51.773241 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-5rhzw_815ed346-ab88-4b9d-9f6b-a56172cc8a9d/manager/0.log"
Apr 25 00:18:51.791612 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:51.791591 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-xh7xg_2ae7c771-1c98-4e8d-bbbc-3c579d314ba2/s3-init/0.log"
Apr 25 00:18:51.816210 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:51.816186 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-st7fs_1c270bee-5bf3-45f2-9b85-e043a21fcca6/seaweedfs/0.log"
Apr 25 00:18:54.888844 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:54.888816 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-bvws7"
Apr 25 00:18:55.595868 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:55.595838 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-66k59_14e1f6a5-afdd-48c3-8639-9d32f9e1b10b/migrator/0.log"
Apr 25 00:18:55.620194 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:55.620161 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-66k59_14e1f6a5-afdd-48c3-8639-9d32f9e1b10b/graceful-termination/0.log"
Apr 25 00:18:56.006077 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:56.006036 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gtxb2_34e5790d-4147-4de2-8280-8a4d156daee6/kube-storage-version-migrator-operator/1.log"
Apr 25 00:18:56.007163 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:56.007118 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gtxb2_34e5790d-4147-4de2-8280-8a4d156daee6/kube-storage-version-migrator-operator/0.log"
Apr 25 00:18:57.036272 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:57.036233 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4ql4n_3fb990ac-0afa-4098-9aa0-0178a341f1cc/kube-multus/0.log"
Apr 25 00:18:57.090195 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:57.090168 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqqcb_f788507a-76a8-4714-8f6e-bf17c2e1c40a/kube-multus-additional-cni-plugins/0.log"
Apr 25 00:18:57.113073 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:57.113047 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqqcb_f788507a-76a8-4714-8f6e-bf17c2e1c40a/egress-router-binary-copy/0.log"
Apr 25 00:18:57.133733 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:57.133683 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqqcb_f788507a-76a8-4714-8f6e-bf17c2e1c40a/cni-plugins/0.log"
Apr 25 00:18:57.152216 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:57.152150 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqqcb_f788507a-76a8-4714-8f6e-bf17c2e1c40a/bond-cni-plugin/0.log"
Apr 25 00:18:57.174418 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:57.174356 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqqcb_f788507a-76a8-4714-8f6e-bf17c2e1c40a/routeoverride-cni/0.log"
Apr 25 00:18:57.193839 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:57.193804 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqqcb_f788507a-76a8-4714-8f6e-bf17c2e1c40a/whereabouts-cni-bincopy/0.log"
Apr 25 00:18:57.212705 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:57.212657 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqqcb_f788507a-76a8-4714-8f6e-bf17c2e1c40a/whereabouts-cni/0.log"
Apr 25 00:18:57.575967 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:57.575928 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wrw7v_a4df8649-8216-4ed9-b023-a6de8b027cd5/network-metrics-daemon/0.log"
Apr 25 00:18:57.593749 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:57.593724 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wrw7v_a4df8649-8216-4ed9-b023-a6de8b027cd5/kube-rbac-proxy/0.log"
Apr 25 00:18:58.647490 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:58.647373 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-controller/0.log"
Apr 25 00:18:58.666638 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:58.666610 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log"
Apr 25 00:18:58.674631 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:58.674609 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/1.log"
Apr 25 00:18:58.692224 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:58.692195 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/kube-rbac-proxy-node/0.log"
Apr 25 00:18:58.711863 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:58.711837 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/kube-rbac-proxy-ovn-metrics/0.log"
Apr 25 00:18:58.733472 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:58.733430 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/northd/0.log"
Apr 25 00:18:58.752178 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:58.752152 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/nbdb/0.log"
Apr 25 00:18:58.771390 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:58.771358 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/sbdb/0.log"
Apr 25 00:18:58.875627 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:58.875595 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovnkube-controller/0.log"
Apr 25 00:18:59.417167 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:59.417084 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/1.log"
Apr 25 00:18:59.417321 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:59.417163 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bcgjj_8ccca75f-9d61-4cbb-bc55-f033f88df8c6/console-operator/1.log"
Apr 25 00:18:59.421774 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:59.421755 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log"
Apr 25 00:18:59.422000 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:18:59.421981 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj7ls_864575cd-867d-4ff1-99fd-72319ad03b97/ovn-acl-logging/0.log"
Apr 25 00:19:00.123445 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:19:00.123415 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-9l8ks_657f2810-9fef-43b7-825e-f2573c428db1/check-endpoints/0.log"
Apr 25 00:19:00.167454 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:19:00.167425 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-p279k_0badefdd-5292-410f-94d9-30bdbec0d66d/network-check-target-container/0.log"
Apr 25 00:19:01.069811 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:19:01.069777 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-5ddrw_a2f3e825-c2e5-44d7-9f59-45dc7ea2eba2/iptables-alerter/0.log"
Apr 25 00:19:01.713970 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:19:01.713943 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-zgs5z_2070654b-e1dc-4cd4-8770-c6f66f355061/tuned/0.log"
Apr 25 00:19:03.390213 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:19:03.390185 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-7rhm4_1d316ecc-7ca1-4dcd-a561-d363f811198c/cluster-samples-operator/0.log"
Apr 25 00:19:03.405094 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:19:03.405073 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-7rhm4_1d316ecc-7ca1-4dcd-a561-d363f811198c/cluster-samples-operator-watch/0.log"
Apr 25 00:19:04.362039 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:19:04.362006 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-zpjq8_bc0a1f9d-aade-4d80-a5b8-fbc8542431a7/service-ca-operator/1.log"
Apr 25 00:19:04.363018 ip-10-0-132-64 kubenswrapper[2576]: I0425 00:19:04.362992 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-zpjq8_bc0a1f9d-aade-4d80-a5b8-fbc8542431a7/service-ca-operator/0.log"