Apr 20 20:04:52.154526 ip-10-0-140-19 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 20:04:52.522591 ip-10-0-140-19 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:04:52.522591 ip-10-0-140-19 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 20:04:52.522591 ip-10-0-140-19 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:04:52.522591 ip-10-0-140-19 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 20:04:52.522591 ip-10-0-140-19 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:04:52.523619 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.523549 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
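The deprecation warnings above all point at the same fix: carry these values in the kubelet config file named by --config (/etc/kubernetes/kubelet.conf per the FLAG dump later in this log) instead of on the command line. A minimal sketch of the equivalent KubeletConfiguration stanza, assuming the kubelet.config.k8s.io/v1beta1 schema and reusing the flag values printed in the FLAG dump below; the evictionHard threshold is an illustrative placeholder, not this node's setting:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # mirrors --container-runtime-endpoint from the FLAG dump below
    containerRuntimeEndpoint: /var/run/crio/crio.sock
    # mirrors --volume-plugin-dir
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # mirrors --system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi
    systemReserved:
      cpu: "500m"
      ephemeral-storage: "1Gi"
      memory: "1Gi"
    # --minimum-container-ttl-duration is superseded by eviction thresholds;
    # this value is illustrative only, not taken from this node
    evictionHard:
      memory.available: "100Mi"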
Apr 20 20:04:52.528381 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528368 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:04:52.528381 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528381 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528384 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528387 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528390 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528401 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528404 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528408 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528410 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528413 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528416 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528419 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528421 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528424 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528426 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528429 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528432 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528434 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528437 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528439 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528442 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:04:52.528443 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528444 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528447 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528450 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528453 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528456 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528459 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528461 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528464 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528466 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528469 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528471 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528474 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528476 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528479 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528481 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528484 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528491 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528495 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528497 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528499 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:04:52.528937 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528502 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528504 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528507 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528509 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528512 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528514 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528517 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528519 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528522 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528524 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528527 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528530 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528532 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528535 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528538 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528541 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528544 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528547 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528550 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528552 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:04:52.529783 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528555 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528557 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528561 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528563 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528566 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528569 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528571 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528574 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528576 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528579 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528582 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528585 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528589 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528593 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528596 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528599 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528602 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528605 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528609 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:04:52.530567 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528613 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528616 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528618 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528621 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528624 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.528627 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530497 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530509 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530512 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530515 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530519 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530522 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530525 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530527 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530530 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530533 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530535 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530538 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530541 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:04:52.531074 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530544 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530546 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530549 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530551 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530554 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530557 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530559 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530562 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530564 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530567 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530570 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530573 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530576 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530580 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530582 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530585 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530588 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530590 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530593 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530595 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:04:52.531562 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530599 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530603 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530606 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530608 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530611 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530614 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530617 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530629 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530633 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530635 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530638 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530640 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530643 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530646 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530648 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530651 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530653 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530656 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530658 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530661 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:04:52.532058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530663 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530666 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530669 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530671 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530674 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530678 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530682 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530686 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530689 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530692 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530695 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530698 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530702 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530705 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530707 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530710 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530713 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530715 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530718 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:04:52.532578 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530721 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530723 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530726 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530730 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530733 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530736 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530740 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530742 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530745 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530747 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530750 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530753 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530755 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.530758 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
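Every name flagged above by feature_gate.go:328 is an OpenShift-side gate that the kubelet's embedded Kubernetes feature-gate registry does not know; per this log the kubelet warns and continues rather than failing. Note in the FLAG dump below that --feature-gates is empty, so these names presumably reach the kubelet through the featureGates stanza of the config file. A hypothetical fragment along those lines; gate names are copied from the warnings and the resolved map at feature_gate.go:384 near the end of this log, but the values marked illustrative are assumptions, since the actual file contents are not shown:

    featureGates:
      # upstream gates the kubelet recognizes (values from the resolved
      # feature_gate.go:384 map later in this log)
      NodeSwap: false
      UserNamespacesSupport: true
      # OpenShift-specific gates: unknown to feature_gate.go, warned and
      # skipped; values here are illustrative placeholders
      GatewayAPI: false
      PinnedImages: false
      ManagedBootImages: false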
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530829 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530835 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530842 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530846 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530850 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530854 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530858 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 20:04:52.533035 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530863 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530867 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530870 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530873 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530877 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530880 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530884 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530886 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530889 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530892 2575 flags.go:64] FLAG: --cloud-config=""
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530895 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530898 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530902 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530905 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530908 2575 flags.go:64] FLAG: --config-dir=""
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530910 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530914 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530918 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530921 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530924 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530928 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530931 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530934 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530936 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530939 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 20:04:52.533567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530942 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530946 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530949 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530952 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530955 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530958 2575 flags.go:64] FLAG: --enable-server="true"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530960 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530965 2575 flags.go:64] FLAG: --event-burst="100"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530968 2575 flags.go:64] FLAG: --event-qps="50"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530971 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530974 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530978 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530981 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530984 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530987 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530991 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530994 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530996 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.530999 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531002 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531005 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531008 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531011 2575 flags.go:64] FLAG: --feature-gates=""
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531015 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531018 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 20:04:52.534166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531021 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531025 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531028 2575 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531031 2575 flags.go:64] FLAG: --help="false"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531034 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-140-19.ec2.internal"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531037 2575 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531040 2575 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531042 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531046 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531049 2575 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531052 2575 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531054 2575 flags.go:64] FLAG: --image-service-endpoint=""
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531057 2575 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531060 2575 flags.go:64] FLAG: --kube-api-burst="100"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531063 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531066 2575 flags.go:64] FLAG: --kube-api-qps="50"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531069 2575 flags.go:64] FLAG: --kube-reserved=""
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531072 2575 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531075 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531079 2575 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531082 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531084 2575 flags.go:64] FLAG: --lock-file=""
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531087 2575 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531090 2575 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 20 20:04:52.534895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531093 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531098 2575 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531101 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531104 2575 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531106 2575 flags.go:64] FLAG: --logging-format="text"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531109 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531113 2575 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531115 2575 flags.go:64] FLAG: --manifest-url=""
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531118 2575 flags.go:64] FLAG: --manifest-url-header=""
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531123 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531127 2575 flags.go:64] FLAG: --max-open-files="1000000"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531131 2575 flags.go:64] FLAG: --max-pods="110"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531134 2575 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531136 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531139 2575 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531142 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531145 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531148 2575 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531151 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531158 2575 flags.go:64] FLAG: --node-status-max-images="50"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531161 2575 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531164 2575 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531167 2575 flags.go:64] FLAG: --pod-cidr=""
Apr 20 20:04:52.535499 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531170 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531175 2575 flags.go:64] FLAG: --pod-manifest-path=""
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531178 2575 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531181 2575 flags.go:64] FLAG: --pods-per-core="0"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531184 2575 flags.go:64] FLAG: --port="10250"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531187 2575 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531190 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c9578cc765f8fed6"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531194 2575 flags.go:64] FLAG: --qos-reserved=""
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531197 2575 flags.go:64] FLAG: --read-only-port="10255"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531200 2575 flags.go:64] FLAG: --register-node="true"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531203 2575 flags.go:64] FLAG: --register-schedulable="true"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531205 2575 flags.go:64] FLAG: --register-with-taints=""
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531209 2575 flags.go:64] FLAG: --registry-burst="10"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531212 2575 flags.go:64] FLAG: --registry-qps="5"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531215 2575 flags.go:64] FLAG: --reserved-cpus=""
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531217 2575 flags.go:64] FLAG: --reserved-memory=""
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531221 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531224 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531227 2575 flags.go:64] FLAG: --rotate-certificates="false"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531230 2575 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531233 2575 flags.go:64] FLAG: --runonce="false"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531236 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531239 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531242 2575 flags.go:64] FLAG: --seccomp-default="false"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531245 2575 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531264 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 20 20:04:52.536058 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531268 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531271 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531274 2575 flags.go:64] FLAG: --storage-driver-password="root"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531277 2575 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531279 2575 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531282 2575 flags.go:64] FLAG: --storage-driver-user="root"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531285 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531288 2575 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531291 2575 flags.go:64] FLAG: --system-cgroups=""
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531296 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531301 2575 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531304 2575 flags.go:64] FLAG: --tls-cert-file=""
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531311 2575 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531314 2575 flags.go:64] FLAG: --tls-min-version=""
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531317 2575 flags.go:64] FLAG: --tls-private-key-file=""
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531320 2575 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531323 2575 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531326 2575 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531329 2575 flags.go:64] FLAG: --v="2"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531333 2575 flags.go:64] FLAG: --version="false"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531337 2575 flags.go:64] FLAG: --vmodule=""
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531340 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.531343 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531423 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:04:52.536712 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531427 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531431 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531434 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531438 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531441 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531444 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531447 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531450 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531453 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531455 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531458 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531461 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531463 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531466 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531468 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531471 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531473 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531477 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531480 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531482 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:04:52.537309 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531485 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531488 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531490 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531493 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531495 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531498 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531501 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531503 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531506 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531509 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531511 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531514 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531517 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531520 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531522 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531524 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531531 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531534 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531537 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531539 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:04:52.537806 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531542 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531544 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531547 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531549 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531552 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531554 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531557 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531559 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531562 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531566 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531569 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531572 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531575 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531577 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531580 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531583 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531587 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531590 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531593 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:04:52.538342 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531596 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531598 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531601 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531604 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531606 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531609 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531612 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531614 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531617 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531620 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531623 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531625 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531628 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531630 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531633 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531635 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531638 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531640 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531643 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:04:52.538824 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531645 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:04:52.539328 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531648 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:04:52.539328 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531650 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:04:52.539328 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531655 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:04:52.539328 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531657 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:04:52.539328 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531660 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:04:52.539328 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.531663 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:04:52.539328 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.532220 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:04:52.539328 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.539320 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.539337 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539389 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539395 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539398 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539402 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539405 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539408 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539411 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539414 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539417 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539419 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539422 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20
20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539425 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539427 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539431 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539434 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539437 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539439 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539442 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:04:52.539535 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539444 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539447 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539449 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539452 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539455 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539457 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539459 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539462 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539465 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539467 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539470 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539472 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539475 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539478 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539481 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:04:52.540032 ip-10-0-140-19 
kubenswrapper[2575]: W0420 20:04:52.539484 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539486 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539489 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539492 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:04:52.540032 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539494 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539497 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539501 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539504 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539506 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539509 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539511 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539514 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539516 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539520 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539522 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539525 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539527 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539529 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539532 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539534 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539537 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539540 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539542 2575 feature_gate.go:328] unrecognized feature gate: 
AutomatedEtcdBackup Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539545 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:04:52.540528 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539548 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539550 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539553 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539555 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539558 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539561 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539564 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539567 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539569 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539572 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539575 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539577 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539581 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539585 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539587 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539590 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539593 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539596 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539598 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539601 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:04:52.541023 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539603 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539606 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539609 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539611 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539614 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539616 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539619 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539621 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539624 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.539629 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539721 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539725 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539728 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:04:52.541536 ip-10-0-140-19 
kubenswrapper[2575]: W0420 20:04:52.539731 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539734 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539736 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:04:52.541536 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539739 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539742 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539745 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539748 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539751 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539754 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539756 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539759 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539761 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539764 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539766 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539769 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539771 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539774 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539777 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539779 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539782 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539784 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539787 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539795 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages 
Apr 20 20:04:52.541924 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539798 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539800 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539803 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539805 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539808 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539810 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539813 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539815 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539818 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539820 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539824 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
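[Editor's note] The long runs of "unrecognized feature gate" warnings above, together with the "Setting deprecated feature gate KMSv1=true" and "Setting GA feature gate ServiceAccountTokenNodeBinding=true" notices, come from Kubernetes' component-base feature-gate registry: the kubelet is handed the full OpenShift cluster gate list, logs every key its own registry does not know, and accepts deprecated or GA gates with a warning before committing the effective map printed at feature_gate.go:384. A minimal sketch of that registry behavior follows; the registered gate names here are illustrative stand-ins, not the kubelet's real table.

package main

import (
	"fmt"

	"k8s.io/component-base/featuregate"
)

func main() {
	gates := featuregate.NewFeatureGate()

	// Register the gates this binary knows about. The kubelet's real
	// registry is much larger; these two entries are stand-ins.
	if err := gates.Add(map[featuregate.Feature]featuregate.FeatureSpec{
		"KMSv1":       {Default: false, PreRelease: featuregate.Deprecated},
		"ImageVolume": {Default: true, PreRelease: featuregate.Beta},
	}); err != nil {
		panic(err)
	}

	// Overriding a deprecated gate succeeds but emits the same kind of
	// "Setting deprecated feature gate KMSv1=true" klog warning seen in
	// the journal above.
	if err := gates.SetFromMap(map[string]bool{"KMSv1": true}); err != nil {
		fmt.Println("set:", err)
	}
	fmt.Println("KMSv1 enabled:", gates.Enabled("KMSv1")) // true

	// A key missing from the registry is "unrecognized"; depending on the
	// component-base version this surfaces as a returned error or, as in
	// the wrapper logging above, a per-key warning that is then dropped.
	if err := gates.SetFromMap(map[string]bool{"GatewayAPI": true}); err != nil {
		fmt.Println("set:", err) // unrecognized feature gate: GatewayAPI
	}
}
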
Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539828 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539831 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539834 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539837 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539839 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539842 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539844 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539847 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:04:52.542432 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539849 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539852 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539854 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539857 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539859 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539861 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539864 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539866 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539869 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539871 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539874 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539876 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539879 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539882 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:04:52.542896 ip-10-0-140-19 
kubenswrapper[2575]: W0420 20:04:52.539884 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539887 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539889 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539892 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539894 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 20:04:52.542896 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539898 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539901 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539904 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539906 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539909 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539911 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539914 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539917 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539919 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539922 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539924 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539927 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539929 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539932 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539934 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539937 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539939 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539942 2575 
feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539944 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539947 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:04:52.543369 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539949 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:04:52.543890 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:52.539952 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:04:52.543890 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.539956 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 20:04:52.543890 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.540585 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 20:04:52.543890 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.542831 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 20:04:52.543890 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.543592 2575 server.go:1019] "Starting client certificate rotation" Apr 20 20:04:52.543890 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.543693 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 20:04:52.543890 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.543725 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 20:04:52.566612 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.566597 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 20:04:52.570162 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.570144 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 20:04:52.585837 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.585818 2575 log.go:25] "Validated CRI v1 runtime API" Apr 20 20:04:52.591055 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.591040 2575 log.go:25] "Validated CRI v1 image API" Apr 20 20:04:52.592637 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.592617 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 20:04:52.595429 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.595412 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 20:04:52.595870 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.595848 2575 fs.go:135] Filesystem UUIDs: map[046fee70-5ed9-43b5-a98e-6ec02d58039b:/dev/nvme0n1p3 12553b11-4392-4e3a-bc92-07babe42803b:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 20 
20:04:52.595930 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.595869 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 20:04:52.601214 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.601117 2575 manager.go:217] Machine: {Timestamp:2026-04-20 20:04:52.599943954 +0000 UTC m=+0.346864016 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098688 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27137620bc5fa4a8c0a5d5ff82b608 SystemUUID:ec271376-20bc-5fa4-a8c0-a5d5ff82b608 BootID:07b2ae34-6f04-4cb1-ac6d-d1d27d478808 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a6:49:eb:dc:ab Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a6:49:eb:dc:ab Speed:0 Mtu:9001} {Name:ovs-system MacAddress:d2:9f:01:3d:a5:b3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 20:04:52.601214 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.601207 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
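[Editor's note] Just before the machine inventory above, the kubelet probed its container runtime: "Validated CRI v1 runtime API" and "Validated CRI v1 image API" are logged after successful version calls against the CRI socket, and the cgroup driver ("systemd") is taken from the runtime rather than from kubelet flags, per the "Using cgroup driver setting received from the CRI runtime" line. The same Version RPC can be issued directly; a minimal sketch, assuming CRI-O's default socket path /var/run/crio/crio.sock (adjust for other runtimes):

package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Local unix-socket connection to the CRI runtime; no TLS needed.
	conn, err := grpc.NewClient("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// The same Version RPC the kubelet issues when it logs
	// "Validated CRI v1 runtime API".
	client := runtimeapi.NewRuntimeServiceClient(conn)
	resp, err := client.Version(ctx, &runtimeapi.VersionRequest{})
	if err != nil {
		panic(err)
	}
	fmt.Printf("runtime %s %s (CRI %s)\n",
		resp.RuntimeName, resp.RuntimeVersion, resp.RuntimeApiVersion)
}
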
Apr 20 20:04:52.601339 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.601284 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 20:04:52.603211 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.603188 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 20:04:52.603359 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.603215 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-19.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 20:04:52.603402 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.603370 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 20:04:52.603402 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.603379 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 20:04:52.603402 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.603391 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:04:52.604278 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.604268 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:04:52.604975 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.604965 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:04:52.605075 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.605067 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 20:04:52.606779 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.606769 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 20 20:04:52.606828 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.606790 2575 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 20 20:04:52.606828 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.606800 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 20:04:52.606828 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.606809 2575 kubelet.go:397] "Adding apiserver pod source" Apr 20 20:04:52.606828 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.606817 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 20:04:52.607725 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.607713 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:04:52.607770 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.607731 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:04:52.610101 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.610083 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 20:04:52.611652 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.611631 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 20:04:52.612858 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.612844 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 20:04:52.612938 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.612864 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 20:04:52.612938 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.612874 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 20:04:52.612938 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.612894 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 20:04:52.612938 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.612903 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 20:04:52.612938 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.612911 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 20:04:52.612938 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.612919 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 20:04:52.612938 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.612927 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 20:04:52.612938 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.612936 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 20:04:52.613198 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.612945 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 20:04:52.613198 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.612968 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 20:04:52.613198 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.612981 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 20:04:52.613739 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.613728 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 20:04:52.613785 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.613741 2575 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 20 20:04:52.617056 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.617041 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 20:04:52.617142 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.617079 2575 server.go:1295] "Started kubelet" Apr 20 20:04:52.617203 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.617175 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 20:04:52.617566 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.617509 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 20:04:52.617653 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.617582 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 20:04:52.617796 ip-10-0-140-19 systemd[1]: Started Kubernetes Kubelet. Apr 20 20:04:52.619356 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.619339 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 20:04:52.619784 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.619761 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-19.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 20:04:52.619784 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.619756 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 20:04:52.619908 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.619798 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-19.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 20:04:52.619981 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.619969 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 20 20:04:52.623824 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.623804 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 20:04:52.624287 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.624247 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 20:04:52.626846 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.626774 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 20:04:52.627061 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.627043 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 20:04:52.627143 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.627065 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 20:04:52.627206 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.627188 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-19.ec2.internal\" not found" Apr 20 20:04:52.627307 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.627283 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 20 20:04:52.627307 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.627302 2575 
reconciler.go:26] "Reconciler: start to sync state"
Apr 20 20:04:52.627784 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.627764 2575 factory.go:55] Registering systemd factory
Apr 20 20:04:52.627863 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.627787 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 20 20:04:52.628119 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.628102 2575 factory.go:153] Registering CRI-O factory
Apr 20 20:04:52.628204 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.628122 2575 factory.go:223] Registration of the crio container factory successfully
Apr 20 20:04:52.628204 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.628195 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 20:04:52.628335 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.628215 2575 factory.go:103] Registering Raw factory
Apr 20 20:04:52.628335 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.628231 2575 manager.go:1196] Started watching for new ooms in manager
Apr 20 20:04:52.628933 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.628786 2575 manager.go:319] Starting recovery of all containers
Apr 20 20:04:52.629595 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.629556 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 20:04:52.630325 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.630302 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-19.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 20:04:52.630803 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.630765 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 20:04:52.633331 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.629842 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-19.ec2.internal.18a82953bc5f635d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-19.ec2.internal,UID:ip-10-0-140-19.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-19.ec2.internal,},FirstTimestamp:2026-04-20 20:04:52.617053021 +0000 UTC m=+0.363973087,LastTimestamp:2026-04-20 20:04:52.617053021 +0000 UTC m=+0.363973087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-19.ec2.internal,}"
Apr 20 20:04:52.639921 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.639820 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-s4gmv"
Apr 20 20:04:52.640959 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.640944 2575 manager.go:324] Recovery completed
Apr 20 20:04:52.645022 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.645010 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:04:52.647084 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.647070 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-s4gmv"
Apr 20 20:04:52.653643 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.653629 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:04:52.653724 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.653655 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:04:52.653724 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.653667 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:04:52.654152 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.654140 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 20:04:52.654192 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.654152 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 20:04:52.654192 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.654168 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 20:04:52.655625 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.655568 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-19.ec2.internal.18a82953be8db34c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-19.ec2.internal,UID:ip-10-0-140-19.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-19.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-19.ec2.internal,},FirstTimestamp:2026-04-20 20:04:52.653642572 +0000 UTC m=+0.400562634,LastTimestamp:2026-04-20 20:04:52.653642572 +0000 UTC m=+0.400562634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-19.ec2.internal,}"
Apr 20 20:04:52.657225 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.657214 2575 policy_none.go:49] "None policy: Start"
Apr 20 20:04:52.657285 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.657230 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 20:04:52.657285 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.657240 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 20:04:52.713108 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.693426 2575 manager.go:341] "Starting Device Plugin manager"
Apr 20 20:04:52.713108 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.693450 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 20:04:52.713108 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.693458 2575 server.go:85] "Starting device plugin registration server"
Apr 20 20:04:52.713108 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.693648 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 20:04:52.713108 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.693660 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 20:04:52.713108 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.693764 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 20:04:52.713108 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.693844 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 20:04:52.713108 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.693853 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 20:04:52.713108 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.694277 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 20:04:52.713108 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.694312 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-19.ec2.internal\" not found"
Apr 20 20:04:52.725106 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.725081 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 20:04:52.726211 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.726193 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 20:04:52.726311 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.726221 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 20:04:52.726311 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.726237 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 20:04:52.726311 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.726244 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 20:04:52.726311 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.726297 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 20:04:52.728983 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.728968 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:04:52.794777 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.794727 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:04:52.795706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.795690 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:04:52.795793 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.795722 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:04:52.795793 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.795736 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:04:52.795793 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.795763 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-19.ec2.internal"
Apr 20 20:04:52.804761 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.804747 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-19.ec2.internal"
Apr 20 20:04:52.804823 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.804767 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-19.ec2.internal\": node \"ip-10-0-140-19.ec2.internal\" not found"
Apr 20 20:04:52.817960 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.817941 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-19.ec2.internal\" not found"
Apr 20 20:04:52.826659 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.826640 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-19.ec2.internal"]
Apr 20 20:04:52.826726 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.826711 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:04:52.827537 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.827518 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b91c5ba0291d6cbe73c868f71f004e6a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal\" (UID: \"b91c5ba0291d6cbe73c868f71f004e6a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:52.827597 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.827547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b91c5ba0291d6cbe73c868f71f004e6a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal\" (UID: \"b91c5ba0291d6cbe73c868f71f004e6a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:52.828733 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.828719 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:04:52.828800 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.828743 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:04:52.828800 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.828754 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:04:52.830957 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.830944 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:04:52.831087 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.831075 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:52.831128 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.831101 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:04:52.831839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.831821 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:04:52.831839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.831829 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:04:52.831939 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.831844 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:04:52.831939 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.831852 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:04:52.831939 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.831857 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:04:52.831939 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.831865 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:04:52.833975 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.833962 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:52.834024 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.833986 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:04:52.834733 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.834718 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:04:52.834814 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.834748 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:04:52.834814 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.834759 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:04:52.859812 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.859792 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-19.ec2.internal\" not found" node="ip-10-0-140-19.ec2.internal"
Apr 20 20:04:52.865652 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.865633 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-19.ec2.internal\" not found" node="ip-10-0-140-19.ec2.internal"
Apr 20 20:04:52.918540 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:52.918512 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-19.ec2.internal\" not found"
Apr 20 20:04:52.927889 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.927869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b91c5ba0291d6cbe73c868f71f004e6a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal\" (UID: \"b91c5ba0291d6cbe73c868f71f004e6a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:52.927941 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.927901 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b91c5ba0291d6cbe73c868f71f004e6a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal\" (UID: \"b91c5ba0291d6cbe73c868f71f004e6a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:52.927941 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.927936 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b91c5ba0291d6cbe73c868f71f004e6a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal\" (UID: \"b91c5ba0291d6cbe73c868f71f004e6a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:52.928005 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:52.927959 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b91c5ba0291d6cbe73c868f71f004e6a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal\" (UID: \"b91c5ba0291d6cbe73c868f71f004e6a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:53.018976 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:53.018950 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-19.ec2.internal\" not found"
Apr 20 20:04:53.028418 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.028399 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a1223cbd65770cd535fe3d6f40564646-config\") pod \"kube-apiserver-proxy-ip-10-0-140-19.ec2.internal\" (UID: \"a1223cbd65770cd535fe3d6f40564646\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:53.119845 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:53.119793 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-19.ec2.internal\" not found"
Apr 20 20:04:53.129306 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.129279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a1223cbd65770cd535fe3d6f40564646-config\") pod \"kube-apiserver-proxy-ip-10-0-140-19.ec2.internal\" (UID: \"a1223cbd65770cd535fe3d6f40564646\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:53.129413 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.129324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a1223cbd65770cd535fe3d6f40564646-config\") pod \"kube-apiserver-proxy-ip-10-0-140-19.ec2.internal\" (UID: \"a1223cbd65770cd535fe3d6f40564646\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:53.161408 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.161387 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:53.167814 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.167795 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:53.220264 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:53.220236 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-19.ec2.internal\" not found"
Apr 20 20:04:53.320871 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:53.320845 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-19.ec2.internal\" not found"
Apr 20 20:04:53.421516 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:53.421454 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-19.ec2.internal\" not found"
Apr 20 20:04:53.434906 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.434883 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:04:53.522071 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:53.522046 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-19.ec2.internal\" not found"
Apr 20 20:04:53.543588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.543562 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 20:04:53.543988 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.543694 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 20:04:53.543988 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.543728 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 20:04:53.622636 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:53.622606 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-19.ec2.internal\" not found"
Apr 20 20:04:53.624000 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.623972 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 20:04:53.633052 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.633030 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:04:53.639761 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.639740 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 20:04:53.649033 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.648995 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 19:59:52 +0000 UTC" deadline="2027-11-29 15:04:29.810615513 +0000 UTC"
Apr 20 20:04:53.649033 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.649029 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14106h59m36.161589915s"
Apr 20 20:04:53.661375 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.661352 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-726vr"
Apr 20 20:04:53.667339 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.667322 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-726vr"
Apr 20 20:04:53.724984 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.724961 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:53.734960 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.734942 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 20:04:53.735685 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.735673 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal"
Apr 20 20:04:53.749203 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.749182 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 20:04:53.809976 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:53.809951 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb91c5ba0291d6cbe73c868f71f004e6a.slice/crio-47f9756547b22e82c1b3f7258bbeaaedcc2b959a5c8c83c556bb5e82e9394f6f WatchSource:0}: Error finding container 47f9756547b22e82c1b3f7258bbeaaedcc2b959a5c8c83c556bb5e82e9394f6f: Status 404 returned error can't find the container with id 47f9756547b22e82c1b3f7258bbeaaedcc2b959a5c8c83c556bb5e82e9394f6f
Apr 20 20:04:53.810211 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:53.810198 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1223cbd65770cd535fe3d6f40564646.slice/crio-431a90d4bb936f4130ac5d38f0d2be96b9668217748077420f0e4f741c81cee4 WatchSource:0}: Error finding container 431a90d4bb936f4130ac5d38f0d2be96b9668217748077420f0e4f741c81cee4: Status 404 returned error can't find the container with id 431a90d4bb936f4130ac5d38f0d2be96b9668217748077420f0e4f741c81cee4
Apr 20 20:04:53.814597 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.814581 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:04:53.893823 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:53.893804 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:04:54.608312 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.608280 2575 apiserver.go:52] "Watching apiserver"
Apr 20 20:04:54.615904 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.615876 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 20:04:54.616368 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.616344 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-19.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c","openshift-image-registry/node-ca-qcx6p","openshift-multus/multus-9bpfb","openshift-multus/multus-additional-cni-plugins-5x687","openshift-multus/network-metrics-daemon-gf2gx","openshift-network-diagnostics/network-check-target-rn94w","openshift-network-operator/iptables-alerter-bb4hw","kube-system/konnectivity-agent-shngg","openshift-cluster-node-tuning-operator/tuned-mnj4w","openshift-dns/node-resolver-rlhqx","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal","openshift-ovn-kubernetes/ovnkube-node-lwd67"]
Apr 20 20:04:54.621543 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.621501 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w"
Apr 20 20:04:54.621635 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:54.621575 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341"
Apr 20 20:04:54.623739 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.623717 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c"
Apr 20 20:04:54.624348 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.624001 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-shngg"
Apr 20 20:04:54.626267 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.626087 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 20:04:54.626267 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.626087 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 20:04:54.626874 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.626454 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qlnrh\""
Apr 20 20:04:54.626874 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.626493 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 20:04:54.626874 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.626676 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 20:04:54.626874 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.626677 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7xljf\""
Apr 20 20:04:54.631892 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.628753 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 20:04:54.631892 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.629776 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.632486 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.632289 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 20:04:54.633651 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.633358 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx"
Apr 20 20:04:54.633651 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:54.633429 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed"
Apr 20 20:04:54.633651 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.633596 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-bb4hw"
Apr 20 20:04:54.635340 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.635159 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-74xqh\""
Apr 20 20:04:54.635340 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.635196 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 20:04:54.635498 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.635474 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 20:04:54.636649 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.635075 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 20:04:54.636649 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.635910 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:04:54.637505 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.636976 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 20:04:54.637505 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.637025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07203ee1-3321-4aa2-bcf0-1688aa2911e0-cni-binary-copy\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.637505 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.637036 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 20:04:54.637505 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.637041 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nh28c\""
Apr 20 20:04:54.637505 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.637070 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-socket-dir\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c"
Apr 20 20:04:54.637812 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.637789 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpl6h\" (UniqueName: \"kubernetes.io/projected/32a74b02-cb51-4f5d-91a3-63f0dac5b718-kube-api-access-xpl6h\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c"
Apr 20 20:04:54.637889 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.637837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-run-multus-certs\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.637889 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.637880 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c"
Apr 20 20:04:54.638010 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.637907 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-cnibin\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638010 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.637932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfbl2\" (UniqueName: \"kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2\") pod \"network-check-target-rn94w\" (UID: \"b638336f-46b0-4174-be51-b9aa9a0f9341\") " pod="openshift-network-diagnostics/network-check-target-rn94w"
Apr 20 20:04:54.638010 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.637955 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6d199ee7-aff2-40a9-92ab-a7ddd2f94088-agent-certs\") pod \"konnectivity-agent-shngg\" (UID: \"6d199ee7-aff2-40a9-92ab-a7ddd2f94088\") " pod="kube-system/konnectivity-agent-shngg"
Apr 20 20:04:54.638010 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.637998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6d199ee7-aff2-40a9-92ab-a7ddd2f94088-konnectivity-ca\") pod \"konnectivity-agent-shngg\" (UID: \"6d199ee7-aff2-40a9-92ab-a7ddd2f94088\") " pod="kube-system/konnectivity-agent-shngg"
Apr 20 20:04:54.638155 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638031 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-os-release\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638155 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638054 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-run-netns\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638155 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-etc-selinux\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c"
Apr 20 20:04:54.638155 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638108 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-sys-fs\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c"
Apr 20 20:04:54.638155 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638132 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-run-k8s-cni-cncf-io\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638386 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638157 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-var-lib-cni-bin\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638386 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-var-lib-cni-multus\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638386 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638225 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-multus-conf-dir\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638386 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638266 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/07203ee1-3321-4aa2-bcf0-1688aa2911e0-multus-daemon-config\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638386 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638289 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-etc-kubernetes\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638386 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638310 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x9nj\" (UniqueName: \"kubernetes.io/projected/07203ee1-3321-4aa2-bcf0-1688aa2911e0-kube-api-access-8x9nj\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638386 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-registration-dir\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c"
Apr 20 20:04:54.638386 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638356 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-system-cni-dir\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638772 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638399 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-var-lib-kubelet\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638772 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638438 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-device-dir\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c"
Apr 20 20:04:54.638772 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638493 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whq6t\" (UniqueName: \"kubernetes.io/projected/73708303-ce7f-455e-9004-696bd9c8fe9f-kube-api-access-whq6t\") pod \"iptables-alerter-bb4hw\" (UID: \"73708303-ce7f-455e-9004-696bd9c8fe9f\") " pod="openshift-network-operator/iptables-alerter-bb4hw"
Apr 20 20:04:54.638772 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638523 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/73708303-ce7f-455e-9004-696bd9c8fe9f-iptables-alerter-script\") pod \"iptables-alerter-bb4hw\" (UID: \"73708303-ce7f-455e-9004-696bd9c8fe9f\") " pod="openshift-network-operator/iptables-alerter-bb4hw"
Apr 20 20:04:54.638772 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638554 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-multus-cni-dir\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638772 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638598 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-multus-socket-dir-parent\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638772 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638638 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-hostroot\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.638772 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.638662 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73708303-ce7f-455e-9004-696bd9c8fe9f-host-slash\") pod \"iptables-alerter-bb4hw\" (UID: \"73708303-ce7f-455e-9004-696bd9c8fe9f\") " pod="openshift-network-operator/iptables-alerter-bb4hw"
Apr 20 20:04:54.639322 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.639302 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5x687"
Apr 20 20:04:54.641902 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.641530 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qcx6p"
Apr 20 20:04:54.641902 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.641631 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mnj4w"
Apr 20 20:04:54.642819 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.642285 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 20:04:54.642819 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.642327 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-7q7bq\""
Apr 20 20:04:54.642819 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.642493 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 20:04:54.643934 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.643915 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 20:04:54.644570 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.644297 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rlhqx"
Apr 20 20:04:54.644657 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.644586 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 20:04:54.644813 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.644788 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 20:04:54.645200 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.645095 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 20:04:54.645545 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.645526 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:04:54.645672 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.645656 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zjqw7\""
Apr 20 20:04:54.645864 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.645817 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-s66wg\""
Apr 20 20:04:54.646642 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.646583 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67"
Apr 20 20:04:54.646642 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.646598 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-5pn6f\""
Apr 20 20:04:54.647359 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.647336 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 20:04:54.647449 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.647351 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 20:04:54.649344 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.649162 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 20:04:54.649344 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.649197 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 20:04:54.649344 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.649205 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 20:04:54.650221 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.649945 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 20:04:54.650221 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.649985 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 20:04:54.650221 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.650189 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 20:04:54.650476 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.650236 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-s4fh9\""
Apr 20 20:04:54.668036 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.668011 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:59:53 +0000 UTC" deadline="2027-12-28 12:30:23.66702734 +0000 UTC"
Apr 20 20:04:54.668134 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.668036 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14800h25m28.998994976s"
Apr 20 20:04:54.730808 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.728528 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 20:04:54.732503 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.732456 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal" event={"ID":"b91c5ba0291d6cbe73c868f71f004e6a","Type":"ContainerStarted","Data":"47f9756547b22e82c1b3f7258bbeaaedcc2b959a5c8c83c556bb5e82e9394f6f"}
Apr 20 20:04:54.733613 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.733588 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-19.ec2.internal" event={"ID":"a1223cbd65770cd535fe3d6f40564646","Type":"ContainerStarted","Data":"431a90d4bb936f4130ac5d38f0d2be96b9668217748077420f0e4f741c81cee4"}
Apr 20 20:04:54.738851 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.738828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2nwv\" (UniqueName: \"kubernetes.io/projected/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-kube-api-access-z2nwv\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67"
Apr 20 20:04:54.738940 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.738866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/397c50ea-cc56-4ac2-a69e-090e94977ed9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687"
Apr 20 20:04:54.738940 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.738887 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/57392845-ccc8-4912-8291-1fd220cb1fee-tmp\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w"
Apr 20 20:04:54.738940 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.738911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6c86c1d8-db05-4fb0-9906-f6e203ab0fc0-hosts-file\") pod \"node-resolver-rlhqx\" (UID: \"6c86c1d8-db05-4fb0-9906-f6e203ab0fc0\") " pod="openshift-dns/node-resolver-rlhqx"
Apr 20 20:04:54.739084 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.738939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-multus-conf-dir\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.739084 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.738962 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-etc-kubernetes\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.739084 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.738985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-registration-dir\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c"
Apr 20 20:04:54.739084 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whq6t\" (UniqueName: \"kubernetes.io/projected/73708303-ce7f-455e-9004-696bd9c8fe9f-kube-api-access-whq6t\") pod \"iptables-alerter-bb4hw\" (UID: \"73708303-ce7f-455e-9004-696bd9c8fe9f\") " pod="openshift-network-operator/iptables-alerter-bb4hw"
Apr 20 20:04:54.739084 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739033 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-sysctl-d\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w"
Apr 20 20:04:54.739084 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739055 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-run\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w"
Apr 20 20:04:54.739084 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-system-cni-dir\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739100 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-var-lib-kubelet\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-device-dir\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739142 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-host\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-ovnkube-config\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/397c50ea-cc56-4ac2-a69e-090e94977ed9-system-cni-dir\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw4ct\" (UniqueName: \"kubernetes.io/projected/397c50ea-cc56-4ac2-a69e-090e94977ed9-kube-api-access-dw4ct\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/73708303-ce7f-455e-9004-696bd9c8fe9f-iptables-alerter-script\") pod \"iptables-alerter-bb4hw\" (UID: \"73708303-ce7f-455e-9004-696bd9c8fe9f\") " pod="openshift-network-operator/iptables-alerter-bb4hw"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-sysconfig\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739291 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-lib-modules\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739314 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nrjn\" (UniqueName: \"kubernetes.io/projected/6c86c1d8-db05-4fb0-9906-f6e203ab0fc0-kube-api-access-4nrjn\") pod \"node-resolver-rlhqx\" (UID: \"6c86c1d8-db05-4fb0-9906-f6e203ab0fc0\") " pod="openshift-dns/node-resolver-rlhqx"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-run-netns\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/397c50ea-cc56-4ac2-a69e-090e94977ed9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73708303-ce7f-455e-9004-696bd9c8fe9f-host-slash\") pod \"iptables-alerter-bb4hw\" (UID: \"73708303-ce7f-455e-9004-696bd9c8fe9f\") " pod="openshift-network-operator/iptables-alerter-bb4hw"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739399 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ffcd701-9c8c-423c-adee-9e708c55207b-host\") pod \"node-ca-qcx6p\" (UID: \"0ffcd701-9c8c-423c-adee-9e708c55207b\") " pod="openshift-image-registry/node-ca-qcx6p"
Apr 20 20:04:54.739426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739419 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-etc-openvswitch\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpl6h\" (UniqueName: \"kubernetes.io/projected/32a74b02-cb51-4f5d-91a3-63f0dac5b718-kube-api-access-xpl6h\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739471 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-kubelet\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-run-ovn\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739513 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-env-overrides\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739535 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739557 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-slash\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739579 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-run-systemd\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-cnibin\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfbl2\" (UniqueName: \"kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2\") pod \"network-check-target-rn94w\" (UID: \"b638336f-46b0-4174-be51-b9aa9a0f9341\") " pod="openshift-network-diagnostics/network-check-target-rn94w"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6d199ee7-aff2-40a9-92ab-a7ddd2f94088-agent-certs\") pod \"konnectivity-agent-shngg\" (UID: \"6d199ee7-aff2-40a9-92ab-a7ddd2f94088\") " pod="kube-system/konnectivity-agent-shngg"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6d199ee7-aff2-40a9-92ab-a7ddd2f94088-konnectivity-ca\") pod \"konnectivity-agent-shngg\" (UID: \"6d199ee7-aff2-40a9-92ab-a7ddd2f94088\") " pod="kube-system/konnectivity-agent-shngg"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-cni-bin\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739730 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-ovnkube-script-lib\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739753 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/397c50ea-cc56-4ac2-a69e-090e94977ed9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739798 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-os-release\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb"
Apr 20 20:04:54.740091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.739955 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73708303-ce7f-455e-9004-696bd9c8fe9f-host-slash\") pod \"iptables-alerter-bb4hw\" (UID: \"73708303-ce7f-455e-9004-696bd9c8fe9f\") " pod="openshift-network-operator/iptables-alerter-bb4hw"
Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740213 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-etc-selinux\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c"
Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740265 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-var-lib-kubelet\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w"
Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740293 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/57392845-ccc8-4912-8291-1fd220cb1fee-etc-tuned\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w"
Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740318 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/397c50ea-cc56-4ac2-a69e-090e94977ed9-os-release\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687"
Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740346 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/397c50ea-cc56-4ac2-a69e-090e94977ed9-cni-binary-copy\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") "
pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-run-k8s-cni-cncf-io\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740390 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/73708303-ce7f-455e-9004-696bd9c8fe9f-iptables-alerter-script\") pod \"iptables-alerter-bb4hw\" (UID: \"73708303-ce7f-455e-9004-696bd9c8fe9f\") " pod="openshift-network-operator/iptables-alerter-bb4hw" Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740401 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-var-lib-cni-bin\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740431 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-var-lib-cni-multus\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740466 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/07203ee1-3321-4aa2-bcf0-1688aa2911e0-multus-daemon-config\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740484 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-var-lib-cni-multus\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8x9nj\" (UniqueName: \"kubernetes.io/projected/07203ee1-3321-4aa2-bcf0-1688aa2911e0-kube-api-access-8x9nj\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740529 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f8cl\" (UniqueName: \"kubernetes.io/projected/0ffcd701-9c8c-423c-adee-9e708c55207b-kube-api-access-8f8cl\") pod \"node-ca-qcx6p\" (UID: \"0ffcd701-9c8c-423c-adee-9e708c55207b\") " pod="openshift-image-registry/node-ca-qcx6p" Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740532 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-var-lib-kubelet\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740554 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-sysctl-conf\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740579 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-system-cni-dir\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.740839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-ovn-node-metrics-cert\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.741576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740616 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-run-openvswitch\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.741576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-os-release\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.741576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740685 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-etc-kubernetes\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.741576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740699 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-registration-dir\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" Apr 20 20:04:54.741576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740742 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-var-lib-cni-bin\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.741576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740830 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" Apr 20 20:04:54.741576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740881 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-device-dir\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" Apr 20 20:04:54.741576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740936 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-multus-conf-dir\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.741576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740980 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-cnibin\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.741576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.740988 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 20:04:54.741576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741027 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-systemd\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.741576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741053 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-node-log\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.741576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741552 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-run-k8s-cni-cncf-io\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.741576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741563 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6d199ee7-aff2-40a9-92ab-a7ddd2f94088-konnectivity-ca\") pod \"konnectivity-agent-shngg\" (UID: \"6d199ee7-aff2-40a9-92ab-a7ddd2f94088\") " pod="kube-system/konnectivity-agent-shngg" Apr 20 20:04:54.742117 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741616 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-multus-cni-dir\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.742117 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741627 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/07203ee1-3321-4aa2-bcf0-1688aa2911e0-multus-daemon-config\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.742117 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-multus-socket-dir-parent\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.742117 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741671 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-hostroot\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.742117 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741669 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-etc-selinux\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" Apr 20 20:04:54.742117 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741716 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-modprobe-d\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.742117 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741743 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-systemd-units\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.742117 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741761 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-multus-socket-dir-parent\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.742117 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07203ee1-3321-4aa2-bcf0-1688aa2911e0-cni-binary-copy\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.742117 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-multus-cni-dir\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.742117 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.741717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-hostroot\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.742117 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-socket-dir\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" Apr 20 20:04:54.742117 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742116 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0ffcd701-9c8c-423c-adee-9e708c55207b-serviceca\") pod \"node-ca-qcx6p\" (UID: \"0ffcd701-9c8c-423c-adee-9e708c55207b\") " pod="openshift-image-registry/node-ca-qcx6p" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742144 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-kubernetes\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-sys\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742221 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-run-multus-certs\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742291 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c86c1d8-db05-4fb0-9906-f6e203ab0fc0-tmp-dir\") pod \"node-resolver-rlhqx\" (UID: \"6c86c1d8-db05-4fb0-9906-f6e203ab0fc0\") " pod="openshift-dns/node-resolver-rlhqx" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742327 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-cni-netd\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742358 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742421 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/397c50ea-cc56-4ac2-a69e-090e94977ed9-cnibin\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742452 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742490 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-run-multus-certs\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89dxl\" (UniqueName: \"kubernetes.io/projected/de77e01d-c1e5-4a7e-99df-1261a9d21bed-kube-api-access-89dxl\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742449 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-socket-dir\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-run-netns\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-sys-fs\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742602 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07203ee1-3321-4aa2-bcf0-1688aa2911e0-host-run-netns\") pod \"multus-9bpfb\" 
(UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742652 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd55f\" (UniqueName: \"kubernetes.io/projected/57392845-ccc8-4912-8291-1fd220cb1fee-kube-api-access-sd55f\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.742706 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742688 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-var-lib-openvswitch\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.743397 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742750 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/32a74b02-cb51-4f5d-91a3-63f0dac5b718-sys-fs\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" Apr 20 20:04:54.743397 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742770 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07203ee1-3321-4aa2-bcf0-1688aa2911e0-cni-binary-copy\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.743397 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742766 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-log-socket\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.743397 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.742840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-run-ovn-kubernetes\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.744985 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.744943 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6d199ee7-aff2-40a9-92ab-a7ddd2f94088-agent-certs\") pod \"konnectivity-agent-shngg\" (UID: \"6d199ee7-aff2-40a9-92ab-a7ddd2f94088\") " pod="kube-system/konnectivity-agent-shngg" Apr 20 20:04:54.745954 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:54.745915 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:04:54.745954 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:54.745943 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:04:54.745954 ip-10-0-140-19 kubenswrapper[2575]: E0420 
20:04:54.745955 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dfbl2 for pod openshift-network-diagnostics/network-check-target-rn94w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:04:54.746153 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:54.746024 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2 podName:b638336f-46b0-4174-be51-b9aa9a0f9341 nodeName:}" failed. No retries permitted until 2026-04-20 20:04:55.245989952 +0000 UTC m=+2.992910004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dfbl2" (UniqueName: "kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2") pod "network-check-target-rn94w" (UID: "b638336f-46b0-4174-be51-b9aa9a0f9341") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:04:54.748772 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.748746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whq6t\" (UniqueName: \"kubernetes.io/projected/73708303-ce7f-455e-9004-696bd9c8fe9f-kube-api-access-whq6t\") pod \"iptables-alerter-bb4hw\" (UID: \"73708303-ce7f-455e-9004-696bd9c8fe9f\") " pod="openshift-network-operator/iptables-alerter-bb4hw" Apr 20 20:04:54.748888 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.748871 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpl6h\" (UniqueName: \"kubernetes.io/projected/32a74b02-cb51-4f5d-91a3-63f0dac5b718-kube-api-access-xpl6h\") pod \"aws-ebs-csi-driver-node-zmp6c\" (UID: \"32a74b02-cb51-4f5d-91a3-63f0dac5b718\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" Apr 20 20:04:54.749660 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.749617 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x9nj\" (UniqueName: \"kubernetes.io/projected/07203ee1-3321-4aa2-bcf0-1688aa2911e0-kube-api-access-8x9nj\") pod \"multus-9bpfb\" (UID: \"07203ee1-3321-4aa2-bcf0-1688aa2911e0\") " pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.843730 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.843701 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-run-ovn\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.843873 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.843735 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-env-overrides\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.843873 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.843753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-slash\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.843873 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.843777 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-run-systemd\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.843873 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.843816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-cni-bin\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.843873 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.843813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-run-ovn\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.843873 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.843834 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-slash\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.843873 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.843854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-run-systemd\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.843873 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.843870 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-cni-bin\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.844247 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.843881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-ovnkube-script-lib\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.844247 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.843912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/397c50ea-cc56-4ac2-a69e-090e94977ed9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.844247 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.843945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-var-lib-kubelet\") pod \"tuned-mnj4w\" 
(UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.844247 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.843968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/57392845-ccc8-4912-8291-1fd220cb1fee-etc-tuned\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.844247 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.843998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/397c50ea-cc56-4ac2-a69e-090e94977ed9-os-release\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.844247 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/397c50ea-cc56-4ac2-a69e-090e94977ed9-cni-binary-copy\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.844247 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8f8cl\" (UniqueName: \"kubernetes.io/projected/0ffcd701-9c8c-423c-adee-9e708c55207b-kube-api-access-8f8cl\") pod \"node-ca-qcx6p\" (UID: \"0ffcd701-9c8c-423c-adee-9e708c55207b\") " pod="openshift-image-registry/node-ca-qcx6p" Apr 20 20:04:54.844247 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844083 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-sysctl-conf\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.844247 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-var-lib-kubelet\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.844247 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/397c50ea-cc56-4ac2-a69e-090e94977ed9-os-release\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.844247 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-ovn-node-metrics-cert\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.844247 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-run-openvswitch\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.844247 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844208 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-sysctl-conf\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.844247 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844266 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-run-openvswitch\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-systemd\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-node-log\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-modprobe-d\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-systemd-units\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0ffcd701-9c8c-423c-adee-9e708c55207b-serviceca\") pod \"node-ca-qcx6p\" (UID: \"0ffcd701-9c8c-423c-adee-9e708c55207b\") " pod="openshift-image-registry/node-ca-qcx6p" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844476 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-systemd\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-ovnkube-script-lib\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-kubernetes\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844512 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/397c50ea-cc56-4ac2-a69e-090e94977ed9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-sys\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844554 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-sys\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844537 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-node-log\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c86c1d8-db05-4fb0-9906-f6e203ab0fc0-tmp-dir\") pod \"node-resolver-rlhqx\" (UID: \"6c86c1d8-db05-4fb0-9906-f6e203ab0fc0\") " pod="openshift-dns/node-resolver-rlhqx" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-modprobe-d\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844591 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-systemd-units\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844599 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/397c50ea-cc56-4ac2-a69e-090e94977ed9-cni-binary-copy\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-cni-netd\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.844897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844639 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844639 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-kubernetes\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/397c50ea-cc56-4ac2-a69e-090e94977ed9-cnibin\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/397c50ea-cc56-4ac2-a69e-090e94977ed9-cnibin\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844700 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-env-overrides\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844667 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-cni-netd\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844734 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.845768 
ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:54.844749 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89dxl\" (UniqueName: \"kubernetes.io/projected/de77e01d-c1e5-4a7e-99df-1261a9d21bed-kube-api-access-89dxl\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844769 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sd55f\" (UniqueName: \"kubernetes.io/projected/57392845-ccc8-4912-8291-1fd220cb1fee-kube-api-access-sd55f\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:54.844824 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs podName:de77e01d-c1e5-4a7e-99df-1261a9d21bed nodeName:}" failed. No retries permitted until 2026-04-20 20:04:55.344806782 +0000 UTC m=+3.091726836 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs") pod "network-metrics-daemon-gf2gx" (UID: "de77e01d-c1e5-4a7e-99df-1261a9d21bed") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-var-lib-openvswitch\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844852 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c86c1d8-db05-4fb0-9906-f6e203ab0fc0-tmp-dir\") pod \"node-resolver-rlhqx\" (UID: \"6c86c1d8-db05-4fb0-9906-f6e203ab0fc0\") " pod="openshift-dns/node-resolver-rlhqx" Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-log-socket\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-run-ovn-kubernetes\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844936 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-var-lib-openvswitch\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.845768 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2nwv\" (UniqueName: \"kubernetes.io/projected/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-kube-api-access-z2nwv\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844958 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-log-socket\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844977 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-run-ovn-kubernetes\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.846588 ip-10-0-140-19 
kubenswrapper[2575]: I0420 20:04:54.844983 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/397c50ea-cc56-4ac2-a69e-090e94977ed9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.844994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0ffcd701-9c8c-423c-adee-9e708c55207b-serviceca\") pod \"node-ca-qcx6p\" (UID: \"0ffcd701-9c8c-423c-adee-9e708c55207b\") " pod="openshift-image-registry/node-ca-qcx6p" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/57392845-ccc8-4912-8291-1fd220cb1fee-tmp\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6c86c1d8-db05-4fb0-9906-f6e203ab0fc0-hosts-file\") pod \"node-resolver-rlhqx\" (UID: \"6c86c1d8-db05-4fb0-9906-f6e203ab0fc0\") " pod="openshift-dns/node-resolver-rlhqx" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-sysctl-d\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845176 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-run\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845178 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6c86c1d8-db05-4fb0-9906-f6e203ab0fc0-hosts-file\") pod \"node-resolver-rlhqx\" (UID: \"6c86c1d8-db05-4fb0-9906-f6e203ab0fc0\") " pod="openshift-dns/node-resolver-rlhqx" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/397c50ea-cc56-4ac2-a69e-090e94977ed9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-host\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 
20:04:54.845229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-ovnkube-config\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-sysctl-d\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-run\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845272 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/397c50ea-cc56-4ac2-a69e-090e94977ed9-system-cni-dir\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845300 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-host\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845319 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw4ct\" (UniqueName: \"kubernetes.io/projected/397c50ea-cc56-4ac2-a69e-090e94977ed9-kube-api-access-dw4ct\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.846588 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845347 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-sysconfig\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-lib-modules\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845321 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/397c50ea-cc56-4ac2-a69e-090e94977ed9-system-cni-dir\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " 
pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nrjn\" (UniqueName: \"kubernetes.io/projected/6c86c1d8-db05-4fb0-9906-f6e203ab0fc0-kube-api-access-4nrjn\") pod \"node-resolver-rlhqx\" (UID: \"6c86c1d8-db05-4fb0-9906-f6e203ab0fc0\") " pod="openshift-dns/node-resolver-rlhqx" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-run-netns\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-lib-modules\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845508 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/397c50ea-cc56-4ac2-a69e-090e94977ed9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ffcd701-9c8c-423c-adee-9e708c55207b-host\") pod \"node-ca-qcx6p\" (UID: \"0ffcd701-9c8c-423c-adee-9e708c55207b\") " pod="openshift-image-registry/node-ca-qcx6p" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845555 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-run-netns\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-etc-openvswitch\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845583 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/57392845-ccc8-4912-8291-1fd220cb1fee-etc-sysconfig\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-kubelet\") pod 
\"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845654 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-host-kubelet\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ffcd701-9c8c-423c-adee-9e708c55207b-host\") pod \"node-ca-qcx6p\" (UID: \"0ffcd701-9c8c-423c-adee-9e708c55207b\") " pod="openshift-image-registry/node-ca-qcx6p" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845720 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-ovnkube-config\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845726 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-etc-openvswitch\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.845979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/397c50ea-cc56-4ac2-a69e-090e94977ed9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.846723 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/57392845-ccc8-4912-8291-1fd220cb1fee-etc-tuned\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.847434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.847139 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/57392845-ccc8-4912-8291-1fd220cb1fee-tmp\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.848186 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.847876 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-ovn-node-metrics-cert\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.854078 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.854034 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw4ct\" (UniqueName: \"kubernetes.io/projected/397c50ea-cc56-4ac2-a69e-090e94977ed9-kube-api-access-dw4ct\") pod 
\"multus-additional-cni-plugins-5x687\" (UID: \"397c50ea-cc56-4ac2-a69e-090e94977ed9\") " pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.854429 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.854412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f8cl\" (UniqueName: \"kubernetes.io/projected/0ffcd701-9c8c-423c-adee-9e708c55207b-kube-api-access-8f8cl\") pod \"node-ca-qcx6p\" (UID: \"0ffcd701-9c8c-423c-adee-9e708c55207b\") " pod="openshift-image-registry/node-ca-qcx6p" Apr 20 20:04:54.854510 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.854451 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89dxl\" (UniqueName: \"kubernetes.io/projected/de77e01d-c1e5-4a7e-99df-1261a9d21bed-kube-api-access-89dxl\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:04:54.854510 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.854501 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd55f\" (UniqueName: \"kubernetes.io/projected/57392845-ccc8-4912-8291-1fd220cb1fee-kube-api-access-sd55f\") pod \"tuned-mnj4w\" (UID: \"57392845-ccc8-4912-8291-1fd220cb1fee\") " pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:54.855028 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.855013 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2nwv\" (UniqueName: \"kubernetes.io/projected/f71db6ea-fd4d-4236-94af-1a0b3a3c623f-kube-api-access-z2nwv\") pod \"ovnkube-node-lwd67\" (UID: \"f71db6ea-fd4d-4236-94af-1a0b3a3c623f\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:54.855380 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.855364 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nrjn\" (UniqueName: \"kubernetes.io/projected/6c86c1d8-db05-4fb0-9906-f6e203ab0fc0-kube-api-access-4nrjn\") pod \"node-resolver-rlhqx\" (UID: \"6c86c1d8-db05-4fb0-9906-f6e203ab0fc0\") " pod="openshift-dns/node-resolver-rlhqx" Apr 20 20:04:54.941435 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.941379 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" Apr 20 20:04:54.949420 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.949403 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-shngg" Apr 20 20:04:54.965027 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.965004 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9bpfb" Apr 20 20:04:54.971587 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.971568 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-bb4hw" Apr 20 20:04:54.979141 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.979123 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5x687" Apr 20 20:04:54.986641 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.986624 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qcx6p" Apr 20 20:04:54.995150 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:54.995132 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" Apr 20 20:04:55.001621 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.001605 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rlhqx" Apr 20 20:04:55.009170 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.009153 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:04:55.038624 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.038597 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:04:55.248300 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.248211 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfbl2\" (UniqueName: \"kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2\") pod \"network-check-target-rn94w\" (UID: \"b638336f-46b0-4174-be51-b9aa9a0f9341\") " pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:04:55.248418 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:55.248369 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:04:55.248418 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:55.248384 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:04:55.248418 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:55.248393 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dfbl2 for pod openshift-network-diagnostics/network-check-target-rn94w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:04:55.248558 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:55.248434 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2 podName:b638336f-46b0-4174-be51-b9aa9a0f9341 nodeName:}" failed. No retries permitted until 2026-04-20 20:04:56.248421374 +0000 UTC m=+3.995341432 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dfbl2" (UniqueName: "kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2") pod "network-check-target-rn94w" (UID: "b638336f-46b0-4174-be51-b9aa9a0f9341") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:04:55.348657 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.348610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:04:55.348813 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:55.348755 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:04:55.348899 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:55.348833 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs podName:de77e01d-c1e5-4a7e-99df-1261a9d21bed nodeName:}" failed. No retries permitted until 2026-04-20 20:04:56.348811566 +0000 UTC m=+4.095731622 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs") pod "network-metrics-daemon-gf2gx" (UID: "de77e01d-c1e5-4a7e-99df-1261a9d21bed") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:04:55.478802 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.478773 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:04:55.617569 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:55.617536 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57392845_ccc8_4912_8291_1fd220cb1fee.slice/crio-6a5436d7e5d72c6190d8242460681998f01f7ed9486dca79b9bba8c6cfc40eff WatchSource:0}: Error finding container 6a5436d7e5d72c6190d8242460681998f01f7ed9486dca79b9bba8c6cfc40eff: Status 404 returned error can't find the container with id 6a5436d7e5d72c6190d8242460681998f01f7ed9486dca79b9bba8c6cfc40eff Apr 20 20:04:55.622573 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:55.622544 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07203ee1_3321_4aa2_bcf0_1688aa2911e0.slice/crio-06b38276d322b2ef5193a3600b0a41ed29564690f7768376bbdef265693b3040 WatchSource:0}: Error finding container 06b38276d322b2ef5193a3600b0a41ed29564690f7768376bbdef265693b3040: Status 404 returned error can't find the container with id 06b38276d322b2ef5193a3600b0a41ed29564690f7768376bbdef265693b3040 Apr 20 20:04:55.623416 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:55.623383 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d199ee7_aff2_40a9_92ab_a7ddd2f94088.slice/crio-bb2512b31c39c0d3a5e4efedc448d79b04cde53574cda31d0aebfaa7607aed8f WatchSource:0}: Error finding container bb2512b31c39c0d3a5e4efedc448d79b04cde53574cda31d0aebfaa7607aed8f: Status 404 returned error can't find the container with id 
bb2512b31c39c0d3a5e4efedc448d79b04cde53574cda31d0aebfaa7607aed8f Apr 20 20:04:55.624037 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:55.623950 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32a74b02_cb51_4f5d_91a3_63f0dac5b718.slice/crio-3e28add313cd036683c63323fa8ae3d16a0c8e5b670dd78444966f5ab7f3c555 WatchSource:0}: Error finding container 3e28add313cd036683c63323fa8ae3d16a0c8e5b670dd78444966f5ab7f3c555: Status 404 returned error can't find the container with id 3e28add313cd036683c63323fa8ae3d16a0c8e5b670dd78444966f5ab7f3c555 Apr 20 20:04:55.625085 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:55.624628 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod397c50ea_cc56_4ac2_a69e_090e94977ed9.slice/crio-008261778c5c04e7700d8da2f2923bf1efe1a15b43cba2190b6e42f78e9f8664 WatchSource:0}: Error finding container 008261778c5c04e7700d8da2f2923bf1efe1a15b43cba2190b6e42f78e9f8664: Status 404 returned error can't find the container with id 008261778c5c04e7700d8da2f2923bf1efe1a15b43cba2190b6e42f78e9f8664 Apr 20 20:04:55.625703 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:55.625666 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf71db6ea_fd4d_4236_94af_1a0b3a3c623f.slice/crio-0cb9ccb14a7ef5872b745623bf0cf91d129757038786d218df44be1e8ea88910 WatchSource:0}: Error finding container 0cb9ccb14a7ef5872b745623bf0cf91d129757038786d218df44be1e8ea88910: Status 404 returned error can't find the container with id 0cb9ccb14a7ef5872b745623bf0cf91d129757038786d218df44be1e8ea88910 Apr 20 20:04:55.626729 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:55.626659 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ffcd701_9c8c_423c_adee_9e708c55207b.slice/crio-86e3cebcc0dea3d361f05fab60994c7630c78c2938b633930f2afe4e00229024 WatchSource:0}: Error finding container 86e3cebcc0dea3d361f05fab60994c7630c78c2938b633930f2afe4e00229024: Status 404 returned error can't find the container with id 86e3cebcc0dea3d361f05fab60994c7630c78c2938b633930f2afe4e00229024 Apr 20 20:04:55.627540 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:55.627388 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c86c1d8_db05_4fb0_9906_f6e203ab0fc0.slice/crio-c8cb641e3e88a30e0b71afc9a6777b63f910dad2aee9b3fcb5ccde7231d7a9e4 WatchSource:0}: Error finding container c8cb641e3e88a30e0b71afc9a6777b63f910dad2aee9b3fcb5ccde7231d7a9e4: Status 404 returned error can't find the container with id c8cb641e3e88a30e0b71afc9a6777b63f910dad2aee9b3fcb5ccde7231d7a9e4 Apr 20 20:04:55.628353 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:04:55.628327 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73708303_ce7f_455e_9004_696bd9c8fe9f.slice/crio-614bc47dc8f5bbf8261d96b291dfc18cc270cfc8ba89f21e5626e0354aebf938 WatchSource:0}: Error finding container 614bc47dc8f5bbf8261d96b291dfc18cc270cfc8ba89f21e5626e0354aebf938: Status 404 returned error can't find the container with id 614bc47dc8f5bbf8261d96b291dfc18cc270cfc8ba89f21e5626e0354aebf938 Apr 20 20:04:55.669044 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.668883 2575 certificate_manager.go:715] "Certificate rotation deadline 
determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:59:53 +0000 UTC" deadline="2028-01-24 15:11:34.374328202 +0000 UTC" Apr 20 20:04:55.669044 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.669032 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15451h6m38.705300205s" Apr 20 20:04:55.736656 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.736633 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-19.ec2.internal" event={"ID":"a1223cbd65770cd535fe3d6f40564646","Type":"ContainerStarted","Data":"c4a90b95e273f9b3572b73d97b9828c51b0e65f10a4fc5b2b27df8080d0d314e"} Apr 20 20:04:55.737665 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.737638 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rlhqx" event={"ID":"6c86c1d8-db05-4fb0-9906-f6e203ab0fc0","Type":"ContainerStarted","Data":"c8cb641e3e88a30e0b71afc9a6777b63f910dad2aee9b3fcb5ccde7231d7a9e4"} Apr 20 20:04:55.738585 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.738551 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-shngg" event={"ID":"6d199ee7-aff2-40a9-92ab-a7ddd2f94088","Type":"ContainerStarted","Data":"bb2512b31c39c0d3a5e4efedc448d79b04cde53574cda31d0aebfaa7607aed8f"} Apr 20 20:04:55.739538 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.739518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" event={"ID":"32a74b02-cb51-4f5d-91a3-63f0dac5b718","Type":"ContainerStarted","Data":"3e28add313cd036683c63323fa8ae3d16a0c8e5b670dd78444966f5ab7f3c555"} Apr 20 20:04:55.740391 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.740374 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9bpfb" event={"ID":"07203ee1-3321-4aa2-bcf0-1688aa2911e0","Type":"ContainerStarted","Data":"06b38276d322b2ef5193a3600b0a41ed29564690f7768376bbdef265693b3040"} Apr 20 20:04:55.741229 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.741212 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" event={"ID":"57392845-ccc8-4912-8291-1fd220cb1fee","Type":"ContainerStarted","Data":"6a5436d7e5d72c6190d8242460681998f01f7ed9486dca79b9bba8c6cfc40eff"} Apr 20 20:04:55.742016 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.741990 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bb4hw" event={"ID":"73708303-ce7f-455e-9004-696bd9c8fe9f","Type":"ContainerStarted","Data":"614bc47dc8f5bbf8261d96b291dfc18cc270cfc8ba89f21e5626e0354aebf938"} Apr 20 20:04:55.742825 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.742796 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qcx6p" event={"ID":"0ffcd701-9c8c-423c-adee-9e708c55207b","Type":"ContainerStarted","Data":"86e3cebcc0dea3d361f05fab60994c7630c78c2938b633930f2afe4e00229024"} Apr 20 20:04:55.743699 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.743681 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" event={"ID":"f71db6ea-fd4d-4236-94af-1a0b3a3c623f","Type":"ContainerStarted","Data":"0cb9ccb14a7ef5872b745623bf0cf91d129757038786d218df44be1e8ea88910"} Apr 20 20:04:55.744634 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:55.744616 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-5x687" event={"ID":"397c50ea-cc56-4ac2-a69e-090e94977ed9","Type":"ContainerStarted","Data":"008261778c5c04e7700d8da2f2923bf1efe1a15b43cba2190b6e42f78e9f8664"} Apr 20 20:04:56.257793 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:56.257759 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfbl2\" (UniqueName: \"kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2\") pod \"network-check-target-rn94w\" (UID: \"b638336f-46b0-4174-be51-b9aa9a0f9341\") " pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:04:56.257963 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:56.257945 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:04:56.258032 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:56.257971 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:04:56.258032 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:56.257985 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dfbl2 for pod openshift-network-diagnostics/network-check-target-rn94w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:04:56.258158 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:56.258042 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2 podName:b638336f-46b0-4174-be51-b9aa9a0f9341 nodeName:}" failed. No retries permitted until 2026-04-20 20:04:58.258023253 +0000 UTC m=+6.004943318 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dfbl2" (UniqueName: "kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2") pod "network-check-target-rn94w" (UID: "b638336f-46b0-4174-be51-b9aa9a0f9341") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:04:56.358695 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:56.358603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:04:56.358833 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:56.358746 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:04:56.358833 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:56.358805 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs podName:de77e01d-c1e5-4a7e-99df-1261a9d21bed nodeName:}" failed. No retries permitted until 2026-04-20 20:04:58.358787818 +0000 UTC m=+6.105707882 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs") pod "network-metrics-daemon-gf2gx" (UID: "de77e01d-c1e5-4a7e-99df-1261a9d21bed") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:04:56.729734 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:56.729656 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:04:56.730136 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:56.729778 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:04:56.730207 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:56.730194 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:04:56.730328 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:56.730309 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:04:56.758426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:56.758390 2575 generic.go:358] "Generic (PLEG): container finished" podID="b91c5ba0291d6cbe73c868f71f004e6a" containerID="6ad3901ab909953d8cf34a49ecad7a1a296c90ce91250029fe30f53a02795665" exitCode=0 Apr 20 20:04:56.759296 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:56.759240 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal" event={"ID":"b91c5ba0291d6cbe73c868f71f004e6a","Type":"ContainerDied","Data":"6ad3901ab909953d8cf34a49ecad7a1a296c90ce91250029fe30f53a02795665"} Apr 20 20:04:56.776668 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:56.776621 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-19.ec2.internal" podStartSLOduration=3.7766036720000002 podStartE2EDuration="3.776603672s" podCreationTimestamp="2026-04-20 20:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:04:55.750432155 +0000 UTC m=+3.497352226" watchObservedRunningTime="2026-04-20 20:04:56.776603672 +0000 UTC m=+4.523523743" Apr 20 20:04:57.777176 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:57.777142 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal" event={"ID":"b91c5ba0291d6cbe73c868f71f004e6a","Type":"ContainerStarted","Data":"dae775463e07aadc4072881b6200fbbd74ef2067f42e494b7ae7eafa7e3a8ce8"} Apr 20 20:04:57.790695 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:57.790648 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-19.ec2.internal" podStartSLOduration=4.79063117 podStartE2EDuration="4.79063117s" podCreationTimestamp="2026-04-20 20:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:04:57.790288142 +0000 UTC m=+5.537208214" watchObservedRunningTime="2026-04-20 20:04:57.79063117 +0000 UTC m=+5.537551221" Apr 20 20:04:58.279652 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:58.279619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfbl2\" (UniqueName: \"kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2\") pod \"network-check-target-rn94w\" (UID: \"b638336f-46b0-4174-be51-b9aa9a0f9341\") " pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:04:58.280405 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:58.279839 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:04:58.280405 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:58.279861 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:04:58.280405 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:58.279874 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dfbl2 for pod openshift-network-diagnostics/network-check-target-rn94w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:04:58.280405 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:58.279931 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2 podName:b638336f-46b0-4174-be51-b9aa9a0f9341 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:02.27991085 +0000 UTC m=+10.026830904 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dfbl2" (UniqueName: "kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2") pod "network-check-target-rn94w" (UID: "b638336f-46b0-4174-be51-b9aa9a0f9341") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:04:58.380599 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:58.380565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:04:58.380842 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:58.380706 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:04:58.380842 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:58.380765 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs podName:de77e01d-c1e5-4a7e-99df-1261a9d21bed nodeName:}" failed. 
No retries permitted until 2026-04-20 20:05:02.380748541 +0000 UTC m=+10.127668593 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs") pod "network-metrics-daemon-gf2gx" (UID: "de77e01d-c1e5-4a7e-99df-1261a9d21bed") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:04:58.727715 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:58.727640 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:04:58.727715 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:04:58.727661 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:04:58.727941 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:58.727767 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:04:58.727941 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:04:58.727923 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:05:00.727550 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:00.727513 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:00.727969 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:00.727513 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:00.727969 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:00.727662 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:05:00.727969 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:00.727782 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:05:02.312162 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:02.311468 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfbl2\" (UniqueName: \"kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2\") pod \"network-check-target-rn94w\" (UID: \"b638336f-46b0-4174-be51-b9aa9a0f9341\") " pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:02.312162 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:02.311647 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:02.312162 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:02.311666 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:02.312162 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:02.311678 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dfbl2 for pod openshift-network-diagnostics/network-check-target-rn94w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:02.312162 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:02.311752 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2 podName:b638336f-46b0-4174-be51-b9aa9a0f9341 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:10.311715492 +0000 UTC m=+18.058635560 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dfbl2" (UniqueName: "kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2") pod "network-check-target-rn94w" (UID: "b638336f-46b0-4174-be51-b9aa9a0f9341") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:02.412062 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:02.412023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:02.412233 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:02.412151 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:02.412233 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:02.412233 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs podName:de77e01d-c1e5-4a7e-99df-1261a9d21bed nodeName:}" failed. No retries permitted until 2026-04-20 20:05:10.412213783 +0000 UTC m=+18.159133838 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs") pod "network-metrics-daemon-gf2gx" (UID: "de77e01d-c1e5-4a7e-99df-1261a9d21bed") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:02.727452 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:02.727374 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:02.727600 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:02.727507 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:05:02.728537 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:02.728514 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:02.728653 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:02.728628 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:05:04.727045 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:04.727012 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:04.727427 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:04.727018 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:04.727427 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:04.727132 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:05:04.727427 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:04.727240 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:05:06.726596 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:06.726562 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:06.727083 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:06.726696 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:05:06.727083 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:06.726759 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:06.727083 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:06.726869 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:05:08.726589 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:08.726554 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:08.727041 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:08.726677 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:05:08.727041 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:08.726757 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:08.727041 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:08.726875 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:05:10.372509 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:10.372469 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfbl2\" (UniqueName: \"kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2\") pod \"network-check-target-rn94w\" (UID: \"b638336f-46b0-4174-be51-b9aa9a0f9341\") " pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:10.372990 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:10.372606 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:10.372990 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:10.372620 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:10.372990 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:10.372630 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dfbl2 for pod openshift-network-diagnostics/network-check-target-rn94w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:10.372990 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:10.372685 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2 podName:b638336f-46b0-4174-be51-b9aa9a0f9341 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:26.372668439 +0000 UTC m=+34.119588491 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dfbl2" (UniqueName: "kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2") pod "network-check-target-rn94w" (UID: "b638336f-46b0-4174-be51-b9aa9a0f9341") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:10.473822 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:10.473790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:10.473981 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:10.473911 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:10.473981 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:10.473972 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs podName:de77e01d-c1e5-4a7e-99df-1261a9d21bed nodeName:}" failed. No retries permitted until 2026-04-20 20:05:26.473953751 +0000 UTC m=+34.220873812 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs") pod "network-metrics-daemon-gf2gx" (UID: "de77e01d-c1e5-4a7e-99df-1261a9d21bed") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:10.726856 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:10.726760 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:10.727029 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:10.726763 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:10.727029 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:10.726881 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:05:10.727673 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:10.726967 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:05:12.728177 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:12.728138 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:12.728576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:12.728186 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:12.728576 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:12.728244 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:05:12.728576 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:12.728362 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:05:13.804968 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.804773 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rlhqx" event={"ID":"6c86c1d8-db05-4fb0-9906-f6e203ab0fc0","Type":"ContainerStarted","Data":"2b8bfe1a86428f6baec78edb21773f4bfbe709489930034ae8aab8873edc4342"} Apr 20 20:05:13.806045 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.806015 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-shngg" event={"ID":"6d199ee7-aff2-40a9-92ab-a7ddd2f94088","Type":"ContainerStarted","Data":"30a2cf3fbbfadc81785602604e55c4065a7d2d8fd7d79605fb0b83ca6e6c3d5b"} Apr 20 20:05:13.807090 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.807068 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" event={"ID":"32a74b02-cb51-4f5d-91a3-63f0dac5b718","Type":"ContainerStarted","Data":"ddb0922599e172862f45cf80f00ba583a629ef5a5f9a1065ad1040146551ec80"} Apr 20 20:05:13.808562 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.808502 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9bpfb" event={"ID":"07203ee1-3321-4aa2-bcf0-1688aa2911e0","Type":"ContainerStarted","Data":"dd9a25165c97e2ef8eca3aafa314c0839bf75888c339fc487ede7157e19e09a5"} Apr 20 20:05:13.810302 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.810278 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" event={"ID":"57392845-ccc8-4912-8291-1fd220cb1fee","Type":"ContainerStarted","Data":"940f28b12497ab0c94fb22dda0a22dcf4872913a81df38a95d7e88bf19b60ed4"} Apr 20 20:05:13.811736 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.811711 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qcx6p" event={"ID":"0ffcd701-9c8c-423c-adee-9e708c55207b","Type":"ContainerStarted","Data":"53cfbb571537c8166c4fda96687e0c0478a3ccd4896ae36d90afe45af24dcc9a"} Apr 20 20:05:13.814321 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.814304 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:05:13.814641 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.814620 2575 generic.go:358] "Generic (PLEG): container finished" podID="f71db6ea-fd4d-4236-94af-1a0b3a3c623f" containerID="55969e5aaf7137a1eacae8203a7c59b2049763fd9061ee7f14b3cad966f9b4fe" exitCode=1 Apr 20 20:05:13.814725 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.814691 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" event={"ID":"f71db6ea-fd4d-4236-94af-1a0b3a3c623f","Type":"ContainerStarted","Data":"ba1f7b3dfea84d7985c51035fb2349ca0bc379bd9b1db8b78bee4f38dd0eb8ce"} Apr 20 20:05:13.814725 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.814717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" event={"ID":"f71db6ea-fd4d-4236-94af-1a0b3a3c623f","Type":"ContainerStarted","Data":"fd4facee7206b0e5c2702962fa5b0f9a146ae78180e56f90c02d028791e10419"} Apr 20 20:05:13.814818 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.814735 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" 
event={"ID":"f71db6ea-fd4d-4236-94af-1a0b3a3c623f","Type":"ContainerStarted","Data":"8ede7f5c9b9a33fa977dd00a881a7f2bb0e7689a7f05107763201a163660d217"} Apr 20 20:05:13.814818 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.814750 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" event={"ID":"f71db6ea-fd4d-4236-94af-1a0b3a3c623f","Type":"ContainerStarted","Data":"143b30671fcfc5bf4cbb1b151bc37ab09cfc63a499ac949b7dce3c22fd21a029"} Apr 20 20:05:13.814818 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.814764 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" event={"ID":"f71db6ea-fd4d-4236-94af-1a0b3a3c623f","Type":"ContainerDied","Data":"55969e5aaf7137a1eacae8203a7c59b2049763fd9061ee7f14b3cad966f9b4fe"} Apr 20 20:05:13.814818 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.814781 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" event={"ID":"f71db6ea-fd4d-4236-94af-1a0b3a3c623f","Type":"ContainerStarted","Data":"17864ac50121e5940eff0303cdd8c31c281ed5cd0cff7c18ad97752ee7a55b2d"} Apr 20 20:05:13.815942 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.815921 2575 generic.go:358] "Generic (PLEG): container finished" podID="397c50ea-cc56-4ac2-a69e-090e94977ed9" containerID="defda311017ea599662718aecf6c91fe9506ee4f02ec92854665b26e96483791" exitCode=0 Apr 20 20:05:13.816012 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.815958 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x687" event={"ID":"397c50ea-cc56-4ac2-a69e-090e94977ed9","Type":"ContainerDied","Data":"defda311017ea599662718aecf6c91fe9506ee4f02ec92854665b26e96483791"} Apr 20 20:05:13.831573 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.831532 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rlhqx" podStartSLOduration=3.7213966259999998 podStartE2EDuration="20.83151941s" podCreationTimestamp="2026-04-20 20:04:53 +0000 UTC" firstStartedPulling="2026-04-20 20:04:55.649985097 +0000 UTC m=+3.396905145" lastFinishedPulling="2026-04-20 20:05:12.760107857 +0000 UTC m=+20.507027929" observedRunningTime="2026-04-20 20:05:13.817061834 +0000 UTC m=+21.563981906" watchObservedRunningTime="2026-04-20 20:05:13.83151941 +0000 UTC m=+21.578439480" Apr 20 20:05:13.835066 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.832845 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9bpfb" podStartSLOduration=3.684133885 podStartE2EDuration="20.832831155s" podCreationTimestamp="2026-04-20 20:04:53 +0000 UTC" firstStartedPulling="2026-04-20 20:04:55.624424597 +0000 UTC m=+3.371344645" lastFinishedPulling="2026-04-20 20:05:12.773121863 +0000 UTC m=+20.520041915" observedRunningTime="2026-04-20 20:05:13.831413766 +0000 UTC m=+21.578333828" watchObservedRunningTime="2026-04-20 20:05:13.832831155 +0000 UTC m=+21.579751228" Apr 20 20:05:13.865775 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.865732 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-shngg" podStartSLOduration=12.873217565000001 podStartE2EDuration="21.865721344s" podCreationTimestamp="2026-04-20 20:04:52 +0000 UTC" firstStartedPulling="2026-04-20 20:04:55.62653945 +0000 UTC m=+3.373459512" lastFinishedPulling="2026-04-20 20:05:04.619043225 +0000 UTC m=+12.365963291" observedRunningTime="2026-04-20 
Apr 20 20:05:13.894142 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.894095 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-mnj4w" podStartSLOduration=3.753051973 podStartE2EDuration="20.894082754s" podCreationTimestamp="2026-04-20 20:04:53 +0000 UTC" firstStartedPulling="2026-04-20 20:04:55.620236618 +0000 UTC m=+3.367156672" lastFinishedPulling="2026-04-20 20:05:12.761267384 +0000 UTC m=+20.508187453" observedRunningTime="2026-04-20 20:05:13.893675033 +0000 UTC m=+21.640595106" watchObservedRunningTime="2026-04-20 20:05:13.894082754 +0000 UTC m=+21.641002824" Apr 20 20:05:13.894507 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:13.894481 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qcx6p" podStartSLOduration=3.7852693840000002 podStartE2EDuration="20.894473251s" podCreationTimestamp="2026-04-20 20:04:53 +0000 UTC" firstStartedPulling="2026-04-20 20:04:55.649949972 +0000 UTC m=+3.396870035" lastFinishedPulling="2026-04-20 20:05:12.759153839 +0000 UTC m=+20.506073902" observedRunningTime="2026-04-20 20:05:13.877038706 +0000 UTC m=+21.623958777" watchObservedRunningTime="2026-04-20 20:05:13.894473251 +0000 UTC m=+21.641393324" Apr 20 20:05:14.211564 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:14.211544 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 20:05:14.675694 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:14.675654 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-shngg" Apr 20 20:05:14.710143 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:14.710052 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T20:05:14.211560066Z","UUID":"fc96e816-d682-44d6-9cfd-b96917452e84","Handler":null,"Name":"","Endpoint":""} Apr 20 20:05:14.712834 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:14.712816 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 20:05:14.712834 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:14.712837 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 20:05:14.726834 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:14.726814 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:14.726930 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:14.726850 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:14.726968 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:14.726927 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed"
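
The plugin_watcher, RegisterPlugin, and csi_plugin records above are the kubelet's plugin-registration handshake: the EBS CSI driver drops a registration socket under /var/lib/kubelet/plugins_registry/, the watcher notices it, and the CSI layer validates the advertised versions (1.0.0 here) before registering the driver's real endpoint. A stdlib-only sketch of the discovery step, polling where the kubelet uses filesystem notifications, and schematic rather than the actual plugin watcher:

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
        "strings"
        "time"
    )

    func main() {
        // Directory the records above show being scanned.
        dir := "/var/lib/kubelet/plugins_registry"
        seen := map[string]bool{}
        for range time.Tick(2 * time.Second) { // polling stands in for inotify
            entries, err := os.ReadDir(dir)
            if err != nil {
                continue // the directory may not exist yet on a fresh node
            }
            for _, e := range entries {
                name := e.Name()
                if !seen[name] && strings.HasSuffix(name, "-reg.sock") {
                    seen[name] = true
                    // The kubelet would now dial this socket, fetch the plugin's
                    // name, endpoint, and supported versions, validate them
                    // (csi_plugin.go:106), and register the driver (csi_plugin.go:119).
                    fmt.Println("found registration socket:", filepath.Join(dir, name))
                }
            }
        }
    }
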
pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:05:14.727009 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:14.726981 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:05:14.819345 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:14.819311 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bb4hw" event={"ID":"73708303-ce7f-455e-9004-696bd9c8fe9f","Type":"ContainerStarted","Data":"4dd652b94de50a12df85a4a18c882e10142a55cbfc260130ce2a4c352c0b6837"} Apr 20 20:05:14.821129 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:14.821099 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" event={"ID":"32a74b02-cb51-4f5d-91a3-63f0dac5b718","Type":"ContainerStarted","Data":"c5c6cc08e8150f9e2bb9cf9e7e0312e89d16b7667c66e892cea69d9d52363ae3"} Apr 20 20:05:14.842456 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:14.842415 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-bb4hw" podStartSLOduration=5.7819385180000005 podStartE2EDuration="22.842403937s" podCreationTimestamp="2026-04-20 20:04:52 +0000 UTC" firstStartedPulling="2026-04-20 20:04:55.650044359 +0000 UTC m=+3.396964426" lastFinishedPulling="2026-04-20 20:05:12.710509782 +0000 UTC m=+20.457429845" observedRunningTime="2026-04-20 20:05:14.842187095 +0000 UTC m=+22.589107156" watchObservedRunningTime="2026-04-20 20:05:14.842403937 +0000 UTC m=+22.589324004" Apr 20 20:05:15.388912 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:15.388884 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-shngg" Apr 20 20:05:15.389623 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:15.389603 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-shngg" Apr 20 20:05:15.824695 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:15.824665 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" event={"ID":"32a74b02-cb51-4f5d-91a3-63f0dac5b718","Type":"ContainerStarted","Data":"bfcecc2d5b265da145bb47025919d561e1d91766069b19038852a17b7ea674fe"} Apr 20 20:05:15.825478 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:15.825454 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-shngg" Apr 20 20:05:15.855912 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:15.855865 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zmp6c" podStartSLOduration=3.84399438 podStartE2EDuration="23.855842841s" podCreationTimestamp="2026-04-20 20:04:52 +0000 UTC" firstStartedPulling="2026-04-20 20:04:55.625904473 +0000 UTC m=+3.372824526" lastFinishedPulling="2026-04-20 20:05:15.637752923 +0000 UTC m=+23.384672987" observedRunningTime="2026-04-20 20:05:15.842110439 +0000 UTC m=+23.589030510" watchObservedRunningTime="2026-04-20 20:05:15.855842841 +0000 UTC m=+23.602762911" Apr 20 
20:05:16.727171 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:16.726954 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:16.727388 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:16.726954 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:16.727388 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:16.727278 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:05:16.727388 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:16.727364 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:05:16.829632 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:16.829600 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:05:16.830055 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:16.830031 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" event={"ID":"f71db6ea-fd4d-4236-94af-1a0b3a3c623f","Type":"ContainerStarted","Data":"e0f66dcb16791ee02294250904443592a5199d4e81bd4131c19c1878c1f2b7b0"} Apr 20 20:05:18.380084 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.380060 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-j6rtl"] Apr 20 20:05:18.403114 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.403093 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:18.403224 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:18.403167 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-j6rtl" podUID="135fcd8b-54c2-4eb1-bd4f-a23d6443033d"
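
Every "Error syncing pod, skipping" in this stretch has the same root cause: the container runtime reports NetworkReady=false because nothing has written a CNI configuration to /etc/kubernetes/cni/net.d/ yet (OVN-Kubernetes is still starting, per the ovnkube-node events). The readiness condition the runtime keeps failing amounts to a check like the following sketch, which only illustrates the gate and is not the CRI implementation:

    package main

    import (
        "fmt"
        "path/filepath"
    )

    func main() {
        // Path taken from the error messages above.
        confDir := "/etc/kubernetes/cni/net.d"
        matches, err := filepath.Glob(filepath.Join(confDir, "*"))
        if err != nil || len(matches) == 0 {
            // Mirrors the reason and message reported in the log records.
            fmt.Println("NetworkReady=false reason:NetworkPluginNotReady: no CNI configuration file in", confDir)
            return
        }
        fmt.Println("NetworkReady=true, config files:", matches)
    }
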
pod="kube-system/global-pull-secret-syncer-j6rtl" podUID="135fcd8b-54c2-4eb1-bd4f-a23d6443033d" Apr 20 20:05:18.532422 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.532240 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-dbus\") pod \"global-pull-secret-syncer-j6rtl\" (UID: \"135fcd8b-54c2-4eb1-bd4f-a23d6443033d\") " pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:18.532525 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.532470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-kubelet-config\") pod \"global-pull-secret-syncer-j6rtl\" (UID: \"135fcd8b-54c2-4eb1-bd4f-a23d6443033d\") " pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:18.532525 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.532498 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret\") pod \"global-pull-secret-syncer-j6rtl\" (UID: \"135fcd8b-54c2-4eb1-bd4f-a23d6443033d\") " pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:18.633767 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.633739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-dbus\") pod \"global-pull-secret-syncer-j6rtl\" (UID: \"135fcd8b-54c2-4eb1-bd4f-a23d6443033d\") " pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:18.633864 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.633803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-kubelet-config\") pod \"global-pull-secret-syncer-j6rtl\" (UID: \"135fcd8b-54c2-4eb1-bd4f-a23d6443033d\") " pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:18.633864 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.633832 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret\") pod \"global-pull-secret-syncer-j6rtl\" (UID: \"135fcd8b-54c2-4eb1-bd4f-a23d6443033d\") " pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:18.633955 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.633917 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-kubelet-config\") pod \"global-pull-secret-syncer-j6rtl\" (UID: \"135fcd8b-54c2-4eb1-bd4f-a23d6443033d\") " pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:18.634000 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.633949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-dbus\") pod \"global-pull-secret-syncer-j6rtl\" (UID: \"135fcd8b-54c2-4eb1-bd4f-a23d6443033d\") " pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:18.634000 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:18.633963 2575 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:18.634093 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:18.634025 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret podName:135fcd8b-54c2-4eb1-bd4f-a23d6443033d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:19.134007892 +0000 UTC m=+26.880927946 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret") pod "global-pull-secret-syncer-j6rtl" (UID: "135fcd8b-54c2-4eb1-bd4f-a23d6443033d") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:18.727368 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.727307 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:18.727466 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:18.727427 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:05:18.727522 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.727481 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:18.727594 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:18.727570 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:05:18.835756 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.835736 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:05:18.836085 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.836059 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" event={"ID":"f71db6ea-fd4d-4236-94af-1a0b3a3c623f","Type":"ContainerStarted","Data":"9142e65d8951170bfc20936c6801523aa03770775818b4c5d41bff6f5b31750e"} Apr 20 20:05:18.836547 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.836518 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:05:18.836639 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.836555 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:05:18.836639 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.836597 2575 scope.go:117] "RemoveContainer" containerID="55969e5aaf7137a1eacae8203a7c59b2049763fd9061ee7f14b3cad966f9b4fe" Apr 20 20:05:18.837947 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.837910 2575 generic.go:358] "Generic (PLEG): container finished" podID="397c50ea-cc56-4ac2-a69e-090e94977ed9" containerID="540e39b458d20597daec09fe844563483d74e05ca6031ffed7584fb7522c4a76" exitCode=0 Apr 20 20:05:18.837947 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.837943 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x687" event={"ID":"397c50ea-cc56-4ac2-a69e-090e94977ed9","Type":"ContainerDied","Data":"540e39b458d20597daec09fe844563483d74e05ca6031ffed7584fb7522c4a76"} Apr 20 20:05:18.851115 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.851095 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:05:18.851414 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:18.851395 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:05:19.136536 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:19.136515 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret\") pod \"global-pull-secret-syncer-j6rtl\" (UID: \"135fcd8b-54c2-4eb1-bd4f-a23d6443033d\") " pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:19.136645 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:19.136625 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:19.136692 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:19.136673 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret podName:135fcd8b-54c2-4eb1-bd4f-a23d6443033d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:20.136657246 +0000 UTC m=+27.883577296 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret") pod "global-pull-secret-syncer-j6rtl" (UID: "135fcd8b-54c2-4eb1-bd4f-a23d6443033d") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:19.844033 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:19.844005 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:05:19.844441 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:19.844418 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" event={"ID":"f71db6ea-fd4d-4236-94af-1a0b3a3c623f","Type":"ContainerStarted","Data":"540b38eb459355c1f57ea538c14a7d836b273ba4e71c3274a89e1e182e5139b3"} Apr 20 20:05:19.844602 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:19.844584 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 20:05:19.876245 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:19.876205 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" podStartSLOduration=9.698600174 podStartE2EDuration="26.876193441s" podCreationTimestamp="2026-04-20 20:04:53 +0000 UTC" firstStartedPulling="2026-04-20 20:04:55.627364598 +0000 UTC m=+3.374284647" lastFinishedPulling="2026-04-20 20:05:12.804957851 +0000 UTC m=+20.551877914" observedRunningTime="2026-04-20 20:05:19.875971346 +0000 UTC m=+27.622891418" watchObservedRunningTime="2026-04-20 20:05:19.876193441 +0000 UTC m=+27.623113511" Apr 20 20:05:20.142635 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:20.142558 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret\") pod \"global-pull-secret-syncer-j6rtl\" (UID: \"135fcd8b-54c2-4eb1-bd4f-a23d6443033d\") " pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:20.142759 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:20.142689 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:20.142759 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:20.142749 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret podName:135fcd8b-54c2-4eb1-bd4f-a23d6443033d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:22.142734256 +0000 UTC m=+29.889654306 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret") pod "global-pull-secret-syncer-j6rtl" (UID: "135fcd8b-54c2-4eb1-bd4f-a23d6443033d") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:20.182268 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:20.182224 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gf2gx"] Apr 20 20:05:20.182396 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:20.182374 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:20.182514 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:20.182487 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:05:20.184662 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:20.184641 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-j6rtl"] Apr 20 20:05:20.184754 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:20.184727 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:20.184809 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:20.184794 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-j6rtl" podUID="135fcd8b-54c2-4eb1-bd4f-a23d6443033d" Apr 20 20:05:20.187596 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:20.187577 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rn94w"] Apr 20 20:05:20.187690 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:20.187673 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:20.187778 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:20.187759 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:05:20.847911 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:20.847886 2575 generic.go:358] "Generic (PLEG): container finished" podID="397c50ea-cc56-4ac2-a69e-090e94977ed9" containerID="b0251d57bfbf6781eacbcc667cc179e228b00ab7bf90347a99d5af5623ad314c" exitCode=0 Apr 20 20:05:20.848275 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:20.847971 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x687" event={"ID":"397c50ea-cc56-4ac2-a69e-090e94977ed9","Type":"ContainerDied","Data":"b0251d57bfbf6781eacbcc667cc179e228b00ab7bf90347a99d5af5623ad314c"} Apr 20 20:05:20.848275 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:20.848128 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 20:05:21.726978 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:21.726945 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:21.726978 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:21.726977 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:21.727216 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:21.726945 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:21.727216 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:21.727048 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-j6rtl" podUID="135fcd8b-54c2-4eb1-bd4f-a23d6443033d" Apr 20 20:05:21.727216 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:21.727105 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:05:21.727216 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:21.727192 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:05:22.147088 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:22.147036 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:05:22.157646 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:22.157623 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret\") pod \"global-pull-secret-syncer-j6rtl\" (UID: \"135fcd8b-54c2-4eb1-bd4f-a23d6443033d\") " pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:22.157769 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:22.157753 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:22.157833 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:22.157826 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret podName:135fcd8b-54c2-4eb1-bd4f-a23d6443033d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:26.157806428 +0000 UTC m=+33.904726494 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret") pod "global-pull-secret-syncer-j6rtl" (UID: "135fcd8b-54c2-4eb1-bd4f-a23d6443033d") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:22.852328 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:22.852299 2575 generic.go:358] "Generic (PLEG): container finished" podID="397c50ea-cc56-4ac2-a69e-090e94977ed9" containerID="7868164467ca7f66439964d38d1450f3206cdd2d0a8622c00a1f3cd9b4adc41a" exitCode=0 Apr 20 20:05:22.852472 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:22.852359 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x687" event={"ID":"397c50ea-cc56-4ac2-a69e-090e94977ed9","Type":"ContainerDied","Data":"7868164467ca7f66439964d38d1450f3206cdd2d0a8622c00a1f3cd9b4adc41a"} Apr 20 20:05:23.727139 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:23.727106 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:23.727601 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:23.727106 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:23.727601 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:23.727217 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rn94w" podUID="b638336f-46b0-4174-be51-b9aa9a0f9341" Apr 20 20:05:23.727601 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:23.727106 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:23.727601 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:23.727319 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-j6rtl" podUID="135fcd8b-54c2-4eb1-bd4f-a23d6443033d" Apr 20 20:05:23.727601 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:23.727471 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:05:25.620297 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.620124 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-19.ec2.internal" event="NodeReady" Apr 20 20:05:25.620681 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.620384 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 20:05:25.655060 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.655032 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr"] Apr 20 20:05:25.684313 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.684289 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7db79d6bf5-nq2dr"] Apr 20 20:05:25.684474 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.684454 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr" Apr 20 20:05:25.686966 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.686944 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 20:05:25.687157 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.686952 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 20:05:25.687784 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.687764 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-jhjdg\"" Apr 20 20:05:25.687784 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.687781 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 20:05:25.687934 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.687811 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 20:05:25.696910 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.696881 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"] Apr 20 20:05:25.696910 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.696906 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" Apr 20 20:05:25.699562 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.699532 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 20:05:25.699562 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.699547 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-j7xv8\"" Apr 20 20:05:25.699562 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.699559 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 20:05:25.699727 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.699535 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 20:05:25.707924 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.707488 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 20:05:25.708925 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.708904 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"] Apr 20 20:05:25.709272 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.709117 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2" Apr 20 20:05:25.711411 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.711392 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 20:05:25.727188 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.727170 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr"] Apr 20 20:05:25.727188 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.727194 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"] Apr 20 20:05:25.727354 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.727206 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"] Apr 20 20:05:25.727354 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.727219 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7db79d6bf5-nq2dr"] Apr 20 20:05:25.727354 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.727333 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk" Apr 20 20:05:25.727354 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.727347 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2kdv4"] Apr 20 20:05:25.727554 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.727348 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:05:25.727611 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.727569 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:25.727919 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.727759 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-j6rtl" Apr 20 20:05:25.729836 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.729816 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 20:05:25.730490 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.730468 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 20:05:25.730841 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.730814 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 20:05:25.730933 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.730845 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 20:05:25.731886 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.731866 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 20:05:25.731968 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.731914 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:05:25.731968 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.731940 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jt5w7\"" Apr 20 20:05:25.732041 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.731979 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 20:05:25.732090 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.732052 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 20:05:25.732151 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.732118 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lvnms\"" Apr 20 20:05:25.750724 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.750705 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-j7twh"] Apr 20 20:05:25.750854 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.750839 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2kdv4" Apr 20 20:05:25.754142 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.754129 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5cv42\"" Apr 20 20:05:25.754278 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.754165 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 20:05:25.754551 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.754522 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 20:05:25.754650 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.754583 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 20:05:25.770041 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.769663 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2kdv4"] Apr 20 20:05:25.770041 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.769688 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j7twh"] Apr 20 20:05:25.770041 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.769797 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j7twh" Apr 20 20:05:25.772652 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.772612 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 20:05:25.772652 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.772623 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-c4d6d\"" Apr 20 20:05:25.772788 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.772650 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 20:05:25.786393 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.786345 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-certificates\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" Apr 20 20:05:25.786393 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.786378 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-installation-pull-secrets\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" Apr 20 20:05:25.786515 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.786408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5jq6\" (UniqueName: \"kubernetes.io/projected/eb78df99-97b1-4b39-bb9d-7d5ce2659db1-kube-api-access-c5jq6\") pod \"managed-serviceaccount-addon-agent-754f5c7b67-s2xdr\" (UID: \"eb78df99-97b1-4b39-bb9d-7d5ce2659db1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr" Apr 20 
20:05:25.786515 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.786441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-image-registry-private-configuration\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.786609 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.786514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-trusted-ca\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.786609 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.786594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-bound-sa-token\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.786697 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.786626 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.786697 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.786669 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-ca-trust-extracted\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.786797 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.786734 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rmt6\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-kube-api-access-6rmt6\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.786797 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.786763 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eb78df99-97b1-4b39-bb9d-7d5ce2659db1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-754f5c7b67-s2xdr\" (UID: \"eb78df99-97b1-4b39-bb9d-7d5ce2659db1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr"
Apr 20 20:05:25.887320 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-ca-trust-extracted\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.887444 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rmt6\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-kube-api-access-6rmt6\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.887444 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887369 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eb78df99-97b1-4b39-bb9d-7d5ce2659db1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-754f5c7b67-s2xdr\" (UID: \"eb78df99-97b1-4b39-bb9d-7d5ce2659db1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr"
Apr 20 20:05:25.887444 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887391 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/45b9b480-0f24-4767-bcf2-c039e9306050-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.887603 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-certificates\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.887603 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887572 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-installation-pull-secrets\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.887707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887605 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5jq6\" (UniqueName: \"kubernetes.io/projected/eb78df99-97b1-4b39-bb9d-7d5ce2659db1-kube-api-access-c5jq6\") pod \"managed-serviceaccount-addon-agent-754f5c7b67-s2xdr\" (UID: \"eb78df99-97b1-4b39-bb9d-7d5ce2659db1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr"
Apr 20 20:05:25.887707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887636 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/77bd0885-5141-4657-8ae7-140bbc18a034-tmp-dir\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:25.887707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6cm2\" (UniqueName: \"kubernetes.io/projected/77bd0885-5141-4657-8ae7-140bbc18a034-kube-api-access-x6cm2\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:25.887707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-image-registry-private-configuration\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.887707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887703 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert\") pod \"ingress-canary-2kdv4\" (UID: \"c8c422a0-bcac-4afe-96fe-8b9874ed46e5\") " pod="openshift-ingress-canary/ingress-canary-2kdv4"
Apr 20 20:05:25.887942 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887709 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-ca-trust-extracted\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.887942 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887724 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dgh7\" (UniqueName: \"kubernetes.io/projected/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-kube-api-access-5dgh7\") pod \"ingress-canary-2kdv4\" (UID: \"c8c422a0-bcac-4afe-96fe-8b9874ed46e5\") " pod="openshift-ingress-canary/ingress-canary-2kdv4"
Apr 20 20:05:25.887942 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887791 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8f235a79-de42-4459-9343-0a85ee8df4d6-tmp\") pod \"klusterlet-addon-workmgr-77cfb99f57-mpnl2\" (UID: \"8f235a79-de42-4459-9343-0a85ee8df4d6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"
Apr 20 20:05:25.887942 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/45b9b480-0f24-4767-bcf2-c039e9306050-hub\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.887942 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887846 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/45b9b480-0f24-4767-bcf2-c039e9306050-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.887942 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-trusted-ca\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.888305 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.887960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77bd0885-5141-4657-8ae7-140bbc18a034-config-volume\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:25.888305 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.888000 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-bound-sa-token\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.888305 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.888077 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-certificates\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.888305 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.888132 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:25.888305 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.888229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ggqv\" (UniqueName: \"kubernetes.io/projected/8f235a79-de42-4459-9343-0a85ee8df4d6-kube-api-access-8ggqv\") pod \"klusterlet-addon-workmgr-77cfb99f57-mpnl2\" (UID: \"8f235a79-de42-4459-9343-0a85ee8df4d6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"
Apr 20 20:05:25.888305 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.888286 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/45b9b480-0f24-4767-bcf2-c039e9306050-ca\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.888563 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.888336 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbvzv\" (UniqueName: \"kubernetes.io/projected/45b9b480-0f24-4767-bcf2-c039e9306050-kube-api-access-nbvzv\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.888563 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.888394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.888563 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:25.888461 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 20:05:25.888563 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:25.888473 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7db79d6bf5-nq2dr: secret "image-registry-tls" not found
Apr 20 20:05:25.888563 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:25.888526 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls podName:39251b0b-f5d0-45a9-bc58-4e02fa4a556d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:26.388510851 +0000 UTC m=+34.135430900 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls") pod "image-registry-7db79d6bf5-nq2dr" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d") : secret "image-registry-tls" not found
Apr 20 20:05:25.888769 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.888596 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8f235a79-de42-4459-9343-0a85ee8df4d6-klusterlet-config\") pod \"klusterlet-addon-workmgr-77cfb99f57-mpnl2\" (UID: \"8f235a79-de42-4459-9343-0a85ee8df4d6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"
Apr 20 20:05:25.888769 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.888631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/45b9b480-0f24-4767-bcf2-c039e9306050-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.888769 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.888703 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-trusted-ca\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.892124 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.892103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-image-registry-private-configuration\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.892340 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.892320 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eb78df99-97b1-4b39-bb9d-7d5ce2659db1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-754f5c7b67-s2xdr\" (UID: \"eb78df99-97b1-4b39-bb9d-7d5ce2659db1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr"
Apr 20 20:05:25.892503 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.892467 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-installation-pull-secrets\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.897299 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.897281 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rmt6\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-kube-api-access-6rmt6\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.897383 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.897300 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-bound-sa-token\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:25.897383 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.897359 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5jq6\" (UniqueName: \"kubernetes.io/projected/eb78df99-97b1-4b39-bb9d-7d5ce2659db1-kube-api-access-c5jq6\") pod \"managed-serviceaccount-addon-agent-754f5c7b67-s2xdr\" (UID: \"eb78df99-97b1-4b39-bb9d-7d5ce2659db1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr"
Apr 20 20:05:25.988954 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.988934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77bd0885-5141-4657-8ae7-140bbc18a034-config-volume\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:25.989061 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.988969 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:25.989061 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989000 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ggqv\" (UniqueName: \"kubernetes.io/projected/8f235a79-de42-4459-9343-0a85ee8df4d6-kube-api-access-8ggqv\") pod \"klusterlet-addon-workmgr-77cfb99f57-mpnl2\" (UID: \"8f235a79-de42-4459-9343-0a85ee8df4d6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"
Apr 20 20:05:25.989061 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/45b9b480-0f24-4767-bcf2-c039e9306050-ca\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.989372 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989090 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbvzv\" (UniqueName: \"kubernetes.io/projected/45b9b480-0f24-4767-bcf2-c039e9306050-kube-api-access-nbvzv\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.989372 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8f235a79-de42-4459-9343-0a85ee8df4d6-klusterlet-config\") pod \"klusterlet-addon-workmgr-77cfb99f57-mpnl2\" (UID: \"8f235a79-de42-4459-9343-0a85ee8df4d6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"
Apr 20 20:05:25.989372 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989163 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/45b9b480-0f24-4767-bcf2-c039e9306050-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.989372 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989221 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/45b9b480-0f24-4767-bcf2-c039e9306050-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.989372 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989294 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/77bd0885-5141-4657-8ae7-140bbc18a034-tmp-dir\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:25.989372 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6cm2\" (UniqueName: \"kubernetes.io/projected/77bd0885-5141-4657-8ae7-140bbc18a034-kube-api-access-x6cm2\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:25.989372 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989358 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert\") pod \"ingress-canary-2kdv4\" (UID: \"c8c422a0-bcac-4afe-96fe-8b9874ed46e5\") " pod="openshift-ingress-canary/ingress-canary-2kdv4"
Apr 20 20:05:25.989707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989384 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dgh7\" (UniqueName: \"kubernetes.io/projected/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-kube-api-access-5dgh7\") pod \"ingress-canary-2kdv4\" (UID: \"c8c422a0-bcac-4afe-96fe-8b9874ed46e5\") " pod="openshift-ingress-canary/ingress-canary-2kdv4"
Apr 20 20:05:25.989707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989424 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8f235a79-de42-4459-9343-0a85ee8df4d6-tmp\") pod \"klusterlet-addon-workmgr-77cfb99f57-mpnl2\" (UID: \"8f235a79-de42-4459-9343-0a85ee8df4d6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"
Apr 20 20:05:25.989707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/45b9b480-0f24-4767-bcf2-c039e9306050-hub\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.989707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/45b9b480-0f24-4767-bcf2-c039e9306050-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.989707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77bd0885-5141-4657-8ae7-140bbc18a034-config-volume\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:25.989707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989629 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/77bd0885-5141-4657-8ae7-140bbc18a034-tmp-dir\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:25.989707 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:25.989649 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:05:25.989707 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:25.989711 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls podName:77bd0885-5141-4657-8ae7-140bbc18a034 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:26.489694034 +0000 UTC m=+34.236614091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls") pod "dns-default-j7twh" (UID: "77bd0885-5141-4657-8ae7-140bbc18a034") : secret "dns-default-metrics-tls" not found
Apr 20 20:05:25.990100 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8f235a79-de42-4459-9343-0a85ee8df4d6-tmp\") pod \"klusterlet-addon-workmgr-77cfb99f57-mpnl2\" (UID: \"8f235a79-de42-4459-9343-0a85ee8df4d6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"
Apr 20 20:05:25.990100 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:25.989883 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:05:25.990100 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.989915 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/45b9b480-0f24-4767-bcf2-c039e9306050-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.990100 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:25.989948 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert podName:c8c422a0-bcac-4afe-96fe-8b9874ed46e5 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:26.48993107 +0000 UTC m=+34.236851122 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert") pod "ingress-canary-2kdv4" (UID: "c8c422a0-bcac-4afe-96fe-8b9874ed46e5") : secret "canary-serving-cert" not found
Apr 20 20:05:25.991809 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.991781 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/45b9b480-0f24-4767-bcf2-c039e9306050-ca\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.992506 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.992485 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/45b9b480-0f24-4767-bcf2-c039e9306050-hub\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.992600 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.992515 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/45b9b480-0f24-4767-bcf2-c039e9306050-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.992758 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.992738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/45b9b480-0f24-4767-bcf2-c039e9306050-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.993372 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.993356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8f235a79-de42-4459-9343-0a85ee8df4d6-klusterlet-config\") pod \"klusterlet-addon-workmgr-77cfb99f57-mpnl2\" (UID: \"8f235a79-de42-4459-9343-0a85ee8df4d6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"
Apr 20 20:05:25.998958 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.998916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6cm2\" (UniqueName: \"kubernetes.io/projected/77bd0885-5141-4657-8ae7-140bbc18a034-kube-api-access-x6cm2\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:25.999453 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.999408 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dgh7\" (UniqueName: \"kubernetes.io/projected/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-kube-api-access-5dgh7\") pod \"ingress-canary-2kdv4\" (UID: \"c8c422a0-bcac-4afe-96fe-8b9874ed46e5\") " pod="openshift-ingress-canary/ingress-canary-2kdv4"
Apr 20 20:05:25.999547 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.999474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbvzv\" (UniqueName: \"kubernetes.io/projected/45b9b480-0f24-4767-bcf2-c039e9306050-kube-api-access-nbvzv\") pod \"cluster-proxy-proxy-agent-67886544cd-bjjtk\" (UID: \"45b9b480-0f24-4767-bcf2-c039e9306050\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:25.999867 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:25.999845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ggqv\" (UniqueName: \"kubernetes.io/projected/8f235a79-de42-4459-9343-0a85ee8df4d6-kube-api-access-8ggqv\") pod \"klusterlet-addon-workmgr-77cfb99f57-mpnl2\" (UID: \"8f235a79-de42-4459-9343-0a85ee8df4d6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"
Apr 20 20:05:26.003591 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.003572 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr"
Apr 20 20:05:26.021438 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.021416 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"
Apr 20 20:05:26.038342 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.038280 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"
Apr 20 20:05:26.191541 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.191510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret\") pod \"global-pull-secret-syncer-j6rtl\" (UID: \"135fcd8b-54c2-4eb1-bd4f-a23d6443033d\") " pod="kube-system/global-pull-secret-syncer-j6rtl"
Apr 20 20:05:26.193922 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.193898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/135fcd8b-54c2-4eb1-bd4f-a23d6443033d-original-pull-secret\") pod \"global-pull-secret-syncer-j6rtl\" (UID: \"135fcd8b-54c2-4eb1-bd4f-a23d6443033d\") " pod="kube-system/global-pull-secret-syncer-j6rtl"
Apr 20 20:05:26.206347 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.206325 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr"]
Apr 20 20:05:26.210397 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:05:26.210366 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb78df99_97b1_4b39_bb9d_7d5ce2659db1.slice/crio-d9aa5302a313f5993c9408e3eb3cb184bff2ffd0420fa67d1800e163c0ad4960 WatchSource:0}: Error finding container d9aa5302a313f5993c9408e3eb3cb184bff2ffd0420fa67d1800e163c0ad4960: Status 404 returned error can't find the container with id d9aa5302a313f5993c9408e3eb3cb184bff2ffd0420fa67d1800e163c0ad4960
Apr 20 20:05:26.211533 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.211444 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"]
Apr 20 20:05:26.212688 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.212669 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk"]
Apr 20 20:05:26.213655 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:05:26.213635 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45b9b480_0f24_4767_bcf2_c039e9306050.slice/crio-757069e05dcdf2c205f94c3f9f288b0b47587e0ce2fc44dea91cf2427663f9a4 WatchSource:0}: Error finding container 757069e05dcdf2c205f94c3f9f288b0b47587e0ce2fc44dea91cf2427663f9a4: Status 404 returned error can't find the container with id 757069e05dcdf2c205f94c3f9f288b0b47587e0ce2fc44dea91cf2427663f9a4
Apr 20 20:05:26.217570 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:05:26.217547 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f235a79_de42_4459_9343_0a85ee8df4d6.slice/crio-fb825724a9662822dcb4254fa6795e4df485eb9c0655434bb6cc5e11272e8b76 WatchSource:0}: Error finding container fb825724a9662822dcb4254fa6795e4df485eb9c0655434bb6cc5e11272e8b76: Status 404 returned error can't find the container with id fb825724a9662822dcb4254fa6795e4df485eb9c0655434bb6cc5e11272e8b76
Apr 20 20:05:26.367691 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.367616 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-j6rtl"
Apr 20 20:05:26.393410 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.393376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:26.393539 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.393429 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfbl2\" (UniqueName: \"kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2\") pod \"network-check-target-rn94w\" (UID: \"b638336f-46b0-4174-be51-b9aa9a0f9341\") " pod="openshift-network-diagnostics/network-check-target-rn94w"
Apr 20 20:05:26.393584 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:26.393562 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 20:05:26.393584 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:26.393577 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7db79d6bf5-nq2dr: secret "image-registry-tls" not found
Apr 20 20:05:26.393656 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:26.393633 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls podName:39251b0b-f5d0-45a9-bc58-4e02fa4a556d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:27.393618294 +0000 UTC m=+35.140538348 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls") pod "image-registry-7db79d6bf5-nq2dr" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d") : secret "image-registry-tls" not found
Apr 20 20:05:26.396566 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.396542 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfbl2\" (UniqueName: \"kubernetes.io/projected/b638336f-46b0-4174-be51-b9aa9a0f9341-kube-api-access-dfbl2\") pod \"network-check-target-rn94w\" (UID: \"b638336f-46b0-4174-be51-b9aa9a0f9341\") " pod="openshift-network-diagnostics/network-check-target-rn94w"
Apr 20 20:05:26.493932 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.493899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert\") pod \"ingress-canary-2kdv4\" (UID: \"c8c422a0-bcac-4afe-96fe-8b9874ed46e5\") " pod="openshift-ingress-canary/ingress-canary-2kdv4"
Apr 20 20:05:26.494079 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.493950 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx"
Apr 20 20:05:26.494079 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.494004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:26.494079 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:26.494044 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:05:26.494201 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:26.494090 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 20:05:26.494201 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:26.494113 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert podName:c8c422a0-bcac-4afe-96fe-8b9874ed46e5 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:27.494092013 +0000 UTC m=+35.241012072 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert") pod "ingress-canary-2kdv4" (UID: "c8c422a0-bcac-4afe-96fe-8b9874ed46e5") : secret "canary-serving-cert" not found
Apr 20 20:05:26.494201 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:26.494148 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:05:26.494201 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:26.494164 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs podName:de77e01d-c1e5-4a7e-99df-1261a9d21bed nodeName:}" failed. No retries permitted until 2026-04-20 20:05:58.494153914 +0000 UTC m=+66.241073968 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs") pod "network-metrics-daemon-gf2gx" (UID: "de77e01d-c1e5-4a7e-99df-1261a9d21bed") : secret "metrics-daemon-secret" not found
Apr 20 20:05:26.494201 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:26.494196 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls podName:77bd0885-5141-4657-8ae7-140bbc18a034 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:27.494181246 +0000 UTC m=+35.241101298 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls") pod "dns-default-j7twh" (UID: "77bd0885-5141-4657-8ae7-140bbc18a034") : secret "dns-default-metrics-tls" not found
Apr 20 20:05:26.663172 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.663095 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rn94w"
Apr 20 20:05:26.860906 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.860681 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2" event={"ID":"8f235a79-de42-4459-9343-0a85ee8df4d6","Type":"ContainerStarted","Data":"fb825724a9662822dcb4254fa6795e4df485eb9c0655434bb6cc5e11272e8b76"}
Apr 20 20:05:26.861863 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.861836 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk" event={"ID":"45b9b480-0f24-4767-bcf2-c039e9306050","Type":"ContainerStarted","Data":"757069e05dcdf2c205f94c3f9f288b0b47587e0ce2fc44dea91cf2427663f9a4"}
Apr 20 20:05:26.863032 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:26.863008 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr" event={"ID":"eb78df99-97b1-4b39-bb9d-7d5ce2659db1","Type":"ContainerStarted","Data":"d9aa5302a313f5993c9408e3eb3cb184bff2ffd0420fa67d1800e163c0ad4960"}
Apr 20 20:05:27.401048 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:27.401015 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:27.401275 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:27.401174 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 20:05:27.401275 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:27.401193 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7db79d6bf5-nq2dr: secret "image-registry-tls" not found
Apr 20 20:05:27.401275 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:27.401263 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls podName:39251b0b-f5d0-45a9-bc58-4e02fa4a556d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:29.401233749 +0000 UTC m=+37.148153798 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls") pod "image-registry-7db79d6bf5-nq2dr" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d") : secret "image-registry-tls" not found
Apr 20 20:05:27.502355 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:27.502325 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert\") pod \"ingress-canary-2kdv4\" (UID: \"c8c422a0-bcac-4afe-96fe-8b9874ed46e5\") " pod="openshift-ingress-canary/ingress-canary-2kdv4"
Apr 20 20:05:27.502534 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:27.502386 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:27.502534 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:27.502474 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:05:27.502534 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:27.502533 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls podName:77bd0885-5141-4657-8ae7-140bbc18a034 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:29.502511356 +0000 UTC m=+37.249431405 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls") pod "dns-default-j7twh" (UID: "77bd0885-5141-4657-8ae7-140bbc18a034") : secret "dns-default-metrics-tls" not found
Apr 20 20:05:27.502701 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:27.502474 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:05:27.502701 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:27.502591 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert podName:c8c422a0-bcac-4afe-96fe-8b9874ed46e5 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:29.502577861 +0000 UTC m=+37.249497921 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert") pod "ingress-canary-2kdv4" (UID: "c8c422a0-bcac-4afe-96fe-8b9874ed46e5") : secret "canary-serving-cert" not found
Apr 20 20:05:28.533356 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:28.533329 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-j6rtl"]
Apr 20 20:05:28.536224 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:05:28.536196 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod135fcd8b_54c2_4eb1_bd4f_a23d6443033d.slice/crio-a540c2a50d06de8af498e863aa099699a5bb09dc51b0b34bff79725e555eb66a WatchSource:0}: Error finding container a540c2a50d06de8af498e863aa099699a5bb09dc51b0b34bff79725e555eb66a: Status 404 returned error can't find the container with id a540c2a50d06de8af498e863aa099699a5bb09dc51b0b34bff79725e555eb66a
Apr 20 20:05:28.538555 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:28.538533 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rn94w"]
Apr 20 20:05:28.541426 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:05:28.541405 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb638336f_46b0_4174_be51_b9aa9a0f9341.slice/crio-dc34956c3afd27e1c0a8a4d348594ca8e55edcd10e2f22933911543271786951 WatchSource:0}: Error finding container dc34956c3afd27e1c0a8a4d348594ca8e55edcd10e2f22933911543271786951: Status 404 returned error can't find the container with id dc34956c3afd27e1c0a8a4d348594ca8e55edcd10e2f22933911543271786951
Apr 20 20:05:28.869154 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:28.869096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rn94w" event={"ID":"b638336f-46b0-4174-be51-b9aa9a0f9341","Type":"ContainerStarted","Data":"dc34956c3afd27e1c0a8a4d348594ca8e55edcd10e2f22933911543271786951"}
Apr 20 20:05:28.871437 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:28.871231 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-j6rtl" event={"ID":"135fcd8b-54c2-4eb1-bd4f-a23d6443033d","Type":"ContainerStarted","Data":"a540c2a50d06de8af498e863aa099699a5bb09dc51b0b34bff79725e555eb66a"}
Apr 20 20:05:29.419910 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:29.419872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:29.420079 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:29.420028 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 20:05:29.420079 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:29.420061 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7db79d6bf5-nq2dr: secret "image-registry-tls" not found
Apr 20 20:05:29.420192 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:29.420123 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls podName:39251b0b-f5d0-45a9-bc58-4e02fa4a556d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:33.420103894 +0000 UTC m=+41.167023945 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls") pod "image-registry-7db79d6bf5-nq2dr" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d") : secret "image-registry-tls" not found
Apr 20 20:05:29.521164 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:29.521091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert\") pod \"ingress-canary-2kdv4\" (UID: \"c8c422a0-bcac-4afe-96fe-8b9874ed46e5\") " pod="openshift-ingress-canary/ingress-canary-2kdv4"
Apr 20 20:05:29.521340 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:29.521241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:29.521501 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:29.521470 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:05:29.521607 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:29.521555 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls podName:77bd0885-5141-4657-8ae7-140bbc18a034 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:33.521534799 +0000 UTC m=+41.268454857 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls") pod "dns-default-j7twh" (UID: "77bd0885-5141-4657-8ae7-140bbc18a034") : secret "dns-default-metrics-tls" not found
Apr 20 20:05:29.522023 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:29.522004 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:05:29.522125 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:29.522058 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert podName:c8c422a0-bcac-4afe-96fe-8b9874ed46e5 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:33.522042805 +0000 UTC m=+41.268962859 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert") pod "ingress-canary-2kdv4" (UID: "c8c422a0-bcac-4afe-96fe-8b9874ed46e5") : secret "canary-serving-cert" not found
Apr 20 20:05:29.882194 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:29.882160 2575 generic.go:358] "Generic (PLEG): container finished" podID="397c50ea-cc56-4ac2-a69e-090e94977ed9" containerID="1b22fd88c39cf7a7400342fe6a27c231d410de33e7308ca4f475061a468c8e4a" exitCode=0
Apr 20 20:05:29.882651 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:29.882220 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x687" event={"ID":"397c50ea-cc56-4ac2-a69e-090e94977ed9","Type":"ContainerDied","Data":"1b22fd88c39cf7a7400342fe6a27c231d410de33e7308ca4f475061a468c8e4a"}
Apr 20 20:05:30.895738 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:30.894778 2575 generic.go:358] "Generic (PLEG): container finished" podID="397c50ea-cc56-4ac2-a69e-090e94977ed9" containerID="d78900e529eeff17468ca7b6fc889b6992efbdf54cfb8acbac9d10702f2ab902" exitCode=0
Apr 20 20:05:30.895738 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:30.894853 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x687" event={"ID":"397c50ea-cc56-4ac2-a69e-090e94977ed9","Type":"ContainerDied","Data":"d78900e529eeff17468ca7b6fc889b6992efbdf54cfb8acbac9d10702f2ab902"}
Apr 20 20:05:33.454295 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:33.454229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:33.454694 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:33.454353 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 20:05:33.454694 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:33.454368 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7db79d6bf5-nq2dr: secret "image-registry-tls" not found
Apr 20 20:05:33.454694 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:33.454419 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls podName:39251b0b-f5d0-45a9-bc58-4e02fa4a556d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:41.45440355 +0000 UTC m=+49.201323610 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls") pod "image-registry-7db79d6bf5-nq2dr" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d") : secret "image-registry-tls" not found
Apr 20 20:05:33.554631 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:33.554601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert\") pod \"ingress-canary-2kdv4\" (UID: \"c8c422a0-bcac-4afe-96fe-8b9874ed46e5\") " pod="openshift-ingress-canary/ingress-canary-2kdv4"
Apr 20 20:05:33.554799 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:33.554654 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:33.554799 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:33.554754 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:05:33.554799 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:33.554754 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:05:33.554916 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:33.554804 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls podName:77bd0885-5141-4657-8ae7-140bbc18a034 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:41.554789279 +0000 UTC m=+49.301709333 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls") pod "dns-default-j7twh" (UID: "77bd0885-5141-4657-8ae7-140bbc18a034") : secret "dns-default-metrics-tls" not found
Apr 20 20:05:33.554916 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:33.554819 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert podName:c8c422a0-bcac-4afe-96fe-8b9874ed46e5 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:41.554811845 +0000 UTC m=+49.301731894 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert") pod "ingress-canary-2kdv4" (UID: "c8c422a0-bcac-4afe-96fe-8b9874ed46e5") : secret "canary-serving-cert" not found
Apr 20 20:05:37.911831 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:37.911792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rn94w" event={"ID":"b638336f-46b0-4174-be51-b9aa9a0f9341","Type":"ContainerStarted","Data":"c078e229552e04f1f39bd193ef157527b67010ed833d5453090d532bde705069"}
Apr 20 20:05:37.912233 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:37.911946 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rn94w"
Apr 20 20:05:37.913325 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:37.913294 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-j6rtl" event={"ID":"135fcd8b-54c2-4eb1-bd4f-a23d6443033d","Type":"ContainerStarted","Data":"23517a4bdab7f97893a401fffb45eafad133eabf3a6a6032dd1a918bc50b8eb5"}
Apr 20 20:05:37.916460 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:37.916434 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5x687" event={"ID":"397c50ea-cc56-4ac2-a69e-090e94977ed9","Type":"ContainerStarted","Data":"0f75a74ba8d0a6e86485c2df9e96b745684fca74f6e2ab0cef3f39d0cb91c597"}
Apr 20 20:05:37.917720 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:37.917689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2" event={"ID":"8f235a79-de42-4459-9343-0a85ee8df4d6","Type":"ContainerStarted","Data":"8f34a845ec1094f27f2010512f064357a76f49d55afe9f3d2ae0bb21bc4937bd"}
Apr 20 20:05:37.917917 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:37.917895 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"
Apr 20 20:05:37.919121 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:37.919085 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr" event={"ID":"eb78df99-97b1-4b39-bb9d-7d5ce2659db1","Type":"ContainerStarted","Data":"cb5783e2f16d88ba8f1f4dee60f55f2afe4ac445d48c52c7bad50b5b4716b8a2"}
Apr 20 20:05:37.919872 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:37.919851 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2"
Apr 20 20:05:37.920468 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:37.920448 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk" event={"ID":"45b9b480-0f24-4767-bcf2-c039e9306050","Type":"ContainerStarted","Data":"65edf9379f7bc09b181d3f46c6738371f147c897655d97782f3fd878da7dd7d7"}
Apr 20 20:05:37.928535 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:37.928494 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rn94w" podStartSLOduration=37.469828129 podStartE2EDuration="45.928481917s" podCreationTimestamp="2026-04-20 20:04:52 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.543274652 +0000 UTC m=+36.290194701" lastFinishedPulling="2026-04-20 20:05:37.001928438 +0000 UTC m=+44.748848489" observedRunningTime="2026-04-20 20:05:37.927320961 +0000 UTC m=+45.674241051" watchObservedRunningTime="2026-04-20 20:05:37.928481917 +0000 UTC m=+45.675401981"
Apr 20 20:05:37.942417 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:37.942378 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2" podStartSLOduration=24.146350127 podStartE2EDuration="34.942366616s" podCreationTimestamp="2026-04-20 20:05:03 +0000 UTC" firstStartedPulling="2026-04-20 20:05:26.219411966 +0000 UTC m=+33.966332015" lastFinishedPulling="2026-04-20 20:05:37.015428442 +0000 UTC m=+44.762348504" observedRunningTime="2026-04-20 20:05:37.942058795 +0000 UTC m=+45.688978865" watchObservedRunningTime="2026-04-20 20:05:37.942366616 +0000 UTC m=+45.689286687"
Apr 20 20:05:37.955437 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:37.955402 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-j6rtl" podStartSLOduration=11.17024566 podStartE2EDuration="19.955392622s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.538147398 +0000 UTC m=+36.285067447" lastFinishedPulling="2026-04-20 20:05:37.323294346 +0000 UTC m=+45.070214409" observedRunningTime="2026-04-20 20:05:37.955117431 +0000 UTC m=+45.702037504" watchObservedRunningTime="2026-04-20 20:05:37.955392622 +0000 UTC m=+45.702312693"
Apr 20 20:05:37.976811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:37.976769 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5x687" podStartSLOduration=11.82487468 podStartE2EDuration="44.976759132s" podCreationTimestamp="2026-04-20 20:04:53 +0000 UTC" firstStartedPulling="2026-04-20 20:04:55.626485742 +0000 UTC m=+3.373405791" lastFinishedPulling="2026-04-20 20:05:28.778370188 +0000 UTC m=+36.525290243" observedRunningTime="2026-04-20 20:05:37.974980136 +0000 UTC m=+45.721900206" watchObservedRunningTime="2026-04-20 20:05:37.976759132 +0000 UTC m=+45.723679200"
Apr 20 20:05:37.989583 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:37.989553 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr" podStartSLOduration=24.20369175 podStartE2EDuration="34.989543687s" podCreationTimestamp="2026-04-20 20:05:03 +0000 UTC" firstStartedPulling="2026-04-20 20:05:26.212813562 +0000 UTC m=+33.959733624" lastFinishedPulling="2026-04-20 20:05:36.998665506 +0000 UTC m=+44.745585561" observedRunningTime="2026-04-20 20:05:37.989348162 +0000 UTC m=+45.736268238" watchObservedRunningTime="2026-04-20 20:05:37.989543687 +0000 UTC m=+45.736463756"
Apr 20 20:05:40.930313 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:40.930278 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk" event={"ID":"45b9b480-0f24-4767-bcf2-c039e9306050","Type":"ContainerStarted","Data":"73628eb8e932d35e10d3400494f0aec5c5bb0d3092f073959ff6605a7b4b8990"}
Apr 20 20:05:40.930313 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:40.930314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk" event={"ID":"45b9b480-0f24-4767-bcf2-c039e9306050","Type":"ContainerStarted","Data":"f46371e3a44a780ddfa9e08fe266d4c2584c50bb5e79bbc55e64d5cdc25acf90"}
Apr 20 20:05:40.954887 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:40.954842 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk" podStartSLOduration=23.775216075 podStartE2EDuration="37.95483038s" podCreationTimestamp="2026-04-20 20:05:03 +0000 UTC" firstStartedPulling="2026-04-20 20:05:26.216022644 +0000 UTC m=+33.962942708" lastFinishedPulling="2026-04-20 20:05:40.395636955 +0000 UTC m=+48.142557013" observedRunningTime="2026-04-20 20:05:40.953713034 +0000 UTC m=+48.700633109" watchObservedRunningTime="2026-04-20 20:05:40.95483038 +0000 UTC m=+48.701750451"
Apr 20 20:05:41.512567 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:41.512532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:05:41.512708 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:41.512640 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 20:05:41.512708 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:41.512657 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7db79d6bf5-nq2dr: secret "image-registry-tls" not found
Apr 20 20:05:41.512838 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:41.512729 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls podName:39251b0b-f5d0-45a9-bc58-4e02fa4a556d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:57.512709587 +0000 UTC m=+65.259629638 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls") pod "image-registry-7db79d6bf5-nq2dr" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d") : secret "image-registry-tls" not found
Apr 20 20:05:41.613459 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:41.613426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert\") pod \"ingress-canary-2kdv4\" (UID: \"c8c422a0-bcac-4afe-96fe-8b9874ed46e5\") " pod="openshift-ingress-canary/ingress-canary-2kdv4"
Apr 20 20:05:41.613600 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:41.613482 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh"
Apr 20 20:05:41.613600 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:41.613553 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:05:41.613600 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:41.613576 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:05:41.613719 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:41.613616 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert podName:c8c422a0-bcac-4afe-96fe-8b9874ed46e5 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:57.613601358 +0000 UTC m=+65.360521407 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert") pod "ingress-canary-2kdv4" (UID: "c8c422a0-bcac-4afe-96fe-8b9874ed46e5") : secret "canary-serving-cert" not found
Apr 20 20:05:41.613719 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:41.613632 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls podName:77bd0885-5141-4657-8ae7-140bbc18a034 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:57.61362523 +0000 UTC m=+65.360545279 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls") pod "dns-default-j7twh" (UID: "77bd0885-5141-4657-8ae7-140bbc18a034") : secret "dns-default-metrics-tls" not found Apr 20 20:05:52.166162 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:52.166126 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwd67" Apr 20 20:05:57.517538 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:57.517495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" Apr 20 20:05:57.517951 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:57.517634 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:05:57.517951 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:57.517649 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7db79d6bf5-nq2dr: secret "image-registry-tls" not found Apr 20 20:05:57.517951 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:57.517706 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls podName:39251b0b-f5d0-45a9-bc58-4e02fa4a556d nodeName:}" failed. No retries permitted until 2026-04-20 20:06:29.517692976 +0000 UTC m=+97.264613030 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls") pod "image-registry-7db79d6bf5-nq2dr" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d") : secret "image-registry-tls" not found Apr 20 20:05:57.618314 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:57.618282 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert\") pod \"ingress-canary-2kdv4\" (UID: \"c8c422a0-bcac-4afe-96fe-8b9874ed46e5\") " pod="openshift-ingress-canary/ingress-canary-2kdv4" Apr 20 20:05:57.618431 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:57.618338 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh" Apr 20 20:05:57.618431 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:57.618416 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:05:57.618431 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:57.618418 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:05:57.618524 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:57.618462 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls podName:77bd0885-5141-4657-8ae7-140bbc18a034 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:06:29.618449845 +0000 UTC m=+97.365369895 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls") pod "dns-default-j7twh" (UID: "77bd0885-5141-4657-8ae7-140bbc18a034") : secret "dns-default-metrics-tls" not found Apr 20 20:05:57.618524 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:57.618474 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert podName:c8c422a0-bcac-4afe-96fe-8b9874ed46e5 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:29.618468416 +0000 UTC m=+97.365388465 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert") pod "ingress-canary-2kdv4" (UID: "c8c422a0-bcac-4afe-96fe-8b9874ed46e5") : secret "canary-serving-cert" not found Apr 20 20:05:58.524688 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:05:58.524652 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:05:58.525064 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:58.524769 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:05:58.525064 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:05:58.524834 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs podName:de77e01d-c1e5-4a7e-99df-1261a9d21bed nodeName:}" failed. No retries permitted until 2026-04-20 20:07:02.524816344 +0000 UTC m=+130.271736393 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs") pod "network-metrics-daemon-gf2gx" (UID: "de77e01d-c1e5-4a7e-99df-1261a9d21bed") : secret "metrics-daemon-secret" not found Apr 20 20:06:08.926182 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:06:08.926069 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rn94w" Apr 20 20:06:29.525144 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:06:29.525101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" Apr 20 20:06:29.525664 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:06:29.525280 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:06:29.525664 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:06:29.525305 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7db79d6bf5-nq2dr: secret "image-registry-tls" not found Apr 20 20:06:29.525664 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:06:29.525386 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls podName:39251b0b-f5d0-45a9-bc58-4e02fa4a556d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:33.525364278 +0000 UTC m=+161.272284329 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls") pod "image-registry-7db79d6bf5-nq2dr" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d") : secret "image-registry-tls" not found Apr 20 20:06:29.626437 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:06:29.626404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert\") pod \"ingress-canary-2kdv4\" (UID: \"c8c422a0-bcac-4afe-96fe-8b9874ed46e5\") " pod="openshift-ingress-canary/ingress-canary-2kdv4" Apr 20 20:06:29.626626 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:06:29.626456 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh" Apr 20 20:06:29.626626 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:06:29.626570 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:29.626626 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:06:29.626572 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:29.626757 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:06:29.626640 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls podName:77bd0885-5141-4657-8ae7-140bbc18a034 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:07:33.626622773 +0000 UTC m=+161.373542822 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls") pod "dns-default-j7twh" (UID: "77bd0885-5141-4657-8ae7-140bbc18a034") : secret "dns-default-metrics-tls" not found Apr 20 20:06:29.626757 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:06:29.626657 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert podName:c8c422a0-bcac-4afe-96fe-8b9874ed46e5 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:33.626650542 +0000 UTC m=+161.373570590 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert") pod "ingress-canary-2kdv4" (UID: "c8c422a0-bcac-4afe-96fe-8b9874ed46e5") : secret "canary-serving-cert" not found Apr 20 20:07:02.539749 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:02.539703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:07:02.540237 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:07:02.539840 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:07:02.540237 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:07:02.539917 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs podName:de77e01d-c1e5-4a7e-99df-1261a9d21bed nodeName:}" failed. No retries permitted until 2026-04-20 20:09:04.539901262 +0000 UTC m=+252.286821310 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs") pod "network-metrics-daemon-gf2gx" (UID: "de77e01d-c1e5-4a7e-99df-1261a9d21bed") : secret "metrics-daemon-secret" not found Apr 20 20:07:19.419517 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:19.419491 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rlhqx_6c86c1d8-db05-4fb0-9906-f6e203ab0fc0/dns-node-resolver/0.log" Apr 20 20:07:20.220270 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:20.220231 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qcx6p_0ffcd701-9c8c-423c-adee-9e708c55207b/node-ca/0.log" Apr 20 20:07:28.716064 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:07:28.716023 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" podUID="39251b0b-f5d0-45a9-bc58-4e02fa4a556d" Apr 20 20:07:28.760262 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:07:28.760226 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-gf2gx" podUID="de77e01d-c1e5-4a7e-99df-1261a9d21bed" Apr 20 20:07:28.773443 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:07:28.773413 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-2kdv4" podUID="c8c422a0-bcac-4afe-96fe-8b9874ed46e5" Apr 20 20:07:28.780495 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:07:28.780469 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-j7twh" podUID="77bd0885-5141-4657-8ae7-140bbc18a034" Apr 20 20:07:29.197910 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:29.197884 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2kdv4" Apr 20 20:07:29.197910 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:29.197895 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" Apr 20 20:07:33.550883 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:33.550837 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" Apr 20 20:07:33.554140 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:33.554110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls\") pod \"image-registry-7db79d6bf5-nq2dr\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") " pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" Apr 20 20:07:33.651489 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:33.651449 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert\") pod \"ingress-canary-2kdv4\" (UID: \"c8c422a0-bcac-4afe-96fe-8b9874ed46e5\") " pod="openshift-ingress-canary/ingress-canary-2kdv4" Apr 20 20:07:33.651668 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:33.651511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh" Apr 20 20:07:33.653743 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:33.653717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/77bd0885-5141-4657-8ae7-140bbc18a034-metrics-tls\") pod \"dns-default-j7twh\" (UID: \"77bd0885-5141-4657-8ae7-140bbc18a034\") " pod="openshift-dns/dns-default-j7twh" Apr 20 20:07:33.653844 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:33.653824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c422a0-bcac-4afe-96fe-8b9874ed46e5-cert\") pod \"ingress-canary-2kdv4\" (UID: \"c8c422a0-bcac-4afe-96fe-8b9874ed46e5\") " pod="openshift-ingress-canary/ingress-canary-2kdv4" Apr 20 20:07:33.702176 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:33.702152 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-j7xv8\"" Apr 20 20:07:33.702347 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:33.702152 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5cv42\"" Apr 20 20:07:33.709409 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:33.709384 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" Apr 20 20:07:33.709409 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:33.709400 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2kdv4" Apr 20 20:07:33.831643 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:33.831613 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2kdv4"] Apr 20 20:07:33.834618 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:07:33.834595 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8c422a0_bcac_4afe_96fe_8b9874ed46e5.slice/crio-9239f12a3ac09aa6c56abcd5f27b439aa31195a410c75a18a697a79276ad24a5 WatchSource:0}: Error finding container 9239f12a3ac09aa6c56abcd5f27b439aa31195a410c75a18a697a79276ad24a5: Status 404 returned error can't find the container with id 9239f12a3ac09aa6c56abcd5f27b439aa31195a410c75a18a697a79276ad24a5 Apr 20 20:07:33.852983 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:33.852964 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7db79d6bf5-nq2dr"] Apr 20 20:07:33.855058 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:07:33.855032 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39251b0b_f5d0_45a9_bc58_4e02fa4a556d.slice/crio-49d818a1d877c3913e3db4f6ef8e8bc87d6ca6d7321c30c8b81ca775d85faa8e WatchSource:0}: Error finding container 49d818a1d877c3913e3db4f6ef8e8bc87d6ca6d7321c30c8b81ca775d85faa8e: Status 404 returned error can't find the container with id 49d818a1d877c3913e3db4f6ef8e8bc87d6ca6d7321c30c8b81ca775d85faa8e Apr 20 20:07:34.211840 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:34.211734 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2kdv4" event={"ID":"c8c422a0-bcac-4afe-96fe-8b9874ed46e5","Type":"ContainerStarted","Data":"9239f12a3ac09aa6c56abcd5f27b439aa31195a410c75a18a697a79276ad24a5"} Apr 20 20:07:34.213284 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:34.213234 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" event={"ID":"39251b0b-f5d0-45a9-bc58-4e02fa4a556d","Type":"ContainerStarted","Data":"fb104ea50aa79641864f81fd62207f3cb89beb16301132bad802ac761e0b49e2"} Apr 20 20:07:34.213434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:34.213291 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" event={"ID":"39251b0b-f5d0-45a9-bc58-4e02fa4a556d","Type":"ContainerStarted","Data":"49d818a1d877c3913e3db4f6ef8e8bc87d6ca6d7321c30c8b81ca775d85faa8e"} Apr 20 20:07:34.213434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:34.213421 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" Apr 20 20:07:34.232087 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:34.232041 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" podStartSLOduration=161.232026482 podStartE2EDuration="2m41.232026482s" podCreationTimestamp="2026-04-20 20:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:07:34.231013511 +0000 UTC m=+161.977933583" watchObservedRunningTime="2026-04-20 20:07:34.232026482 +0000 UTC m=+161.978946558" Apr 20 20:07:36.218964 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:36.218925 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2kdv4" event={"ID":"c8c422a0-bcac-4afe-96fe-8b9874ed46e5","Type":"ContainerStarted","Data":"e43fdbd7efda2a5de08ff31ce491dab6bcb1b7c3550d70d7cdd02f97c3181286"} Apr 20 20:07:36.235166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:36.235119 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2kdv4" podStartSLOduration=129.518554379 podStartE2EDuration="2m11.235106803s" podCreationTimestamp="2026-04-20 20:05:25 +0000 UTC" firstStartedPulling="2026-04-20 20:07:33.836422555 +0000 UTC m=+161.583342605" lastFinishedPulling="2026-04-20 20:07:35.55297498 +0000 UTC m=+163.299895029" observedRunningTime="2026-04-20 20:07:36.234394539 +0000 UTC m=+163.981314610" watchObservedRunningTime="2026-04-20 20:07:36.235106803 +0000 UTC m=+163.982026874" Apr 20 20:07:37.224518 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:37.224492 2575 generic.go:358] "Generic (PLEG): container finished" podID="eb78df99-97b1-4b39-bb9d-7d5ce2659db1" containerID="cb5783e2f16d88ba8f1f4dee60f55f2afe4ac445d48c52c7bad50b5b4716b8a2" exitCode=255 Apr 20 20:07:37.224781 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:37.224526 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr" event={"ID":"eb78df99-97b1-4b39-bb9d-7d5ce2659db1","Type":"ContainerDied","Data":"cb5783e2f16d88ba8f1f4dee60f55f2afe4ac445d48c52c7bad50b5b4716b8a2"} Apr 20 20:07:37.224913 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:37.224898 2575 scope.go:117] "RemoveContainer" containerID="cb5783e2f16d88ba8f1f4dee60f55f2afe4ac445d48c52c7bad50b5b4716b8a2" Apr 20 20:07:37.918835 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:37.918784 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2" podUID="8f235a79-de42-4459-9343-0a85ee8df4d6" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.8:8000/readyz\": dial tcp 10.132.0.8:8000: connect: connection refused" Apr 20 20:07:38.228368 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:38.228284 2575 generic.go:358] "Generic (PLEG): container finished" podID="8f235a79-de42-4459-9343-0a85ee8df4d6" containerID="8f34a845ec1094f27f2010512f064357a76f49d55afe9f3d2ae0bb21bc4937bd" exitCode=1 Apr 20 20:07:38.228368 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:38.228359 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2" event={"ID":"8f235a79-de42-4459-9343-0a85ee8df4d6","Type":"ContainerDied","Data":"8f34a845ec1094f27f2010512f064357a76f49d55afe9f3d2ae0bb21bc4937bd"} Apr 20 20:07:38.228803 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:38.228694 2575 scope.go:117] "RemoveContainer" containerID="8f34a845ec1094f27f2010512f064357a76f49d55afe9f3d2ae0bb21bc4937bd" Apr 20 20:07:38.230048 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:38.230030 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754f5c7b67-s2xdr" event={"ID":"eb78df99-97b1-4b39-bb9d-7d5ce2659db1","Type":"ContainerStarted","Data":"8b8c12d9d38ecd3d3b96cf34a8d4ab4604162b0ae56dd60e4a381f87651b337d"} Apr 20 20:07:39.233913 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:39.233881 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2" event={"ID":"8f235a79-de42-4459-9343-0a85ee8df4d6","Type":"ContainerStarted","Data":"817049e6c08d0597491d45ee68f78766413afd095b4025d0239659124c096e8e"} Apr 20 20:07:39.234276 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:39.234130 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2" Apr 20 20:07:39.234722 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:39.234705 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77cfb99f57-mpnl2" Apr 20 20:07:39.727427 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:39.727395 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:07:40.726870 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:40.726834 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j7twh" Apr 20 20:07:40.729698 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:40.729679 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-c4d6d\"" Apr 20 20:07:40.737596 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:40.737580 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j7twh" Apr 20 20:07:40.848404 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:40.848371 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j7twh"] Apr 20 20:07:40.851473 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:07:40.851446 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77bd0885_5141_4657_8ae7_140bbc18a034.slice/crio-b37a9c04f72124f364731b216e40128cfb5d82dcd4734385f52e6e0e94d823ae WatchSource:0}: Error finding container b37a9c04f72124f364731b216e40128cfb5d82dcd4734385f52e6e0e94d823ae: Status 404 returned error can't find the container with id b37a9c04f72124f364731b216e40128cfb5d82dcd4734385f52e6e0e94d823ae Apr 20 20:07:41.238901 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.238874 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j7twh" event={"ID":"77bd0885-5141-4657-8ae7-140bbc18a034","Type":"ContainerStarted","Data":"b37a9c04f72124f364731b216e40128cfb5d82dcd4734385f52e6e0e94d823ae"} Apr 20 20:07:41.648681 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.648643 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rsxtz"] Apr 20 20:07:41.652082 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.652057 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.654581 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.654554 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 20:07:41.654732 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.654712 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 20:07:41.654732 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.654724 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 20:07:41.655740 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.655719 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5f2m5\"" Apr 20 20:07:41.655832 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.655776 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 20:07:41.664161 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.664140 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rsxtz"] Apr 20 20:07:41.708038 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.708016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e848e523-e953-4cef-b352-34e3d9adf16c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.708156 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.708068 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fnkx\" (UniqueName: \"kubernetes.io/projected/e848e523-e953-4cef-b352-34e3d9adf16c-kube-api-access-9fnkx\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.708156 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.708133 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e848e523-e953-4cef-b352-34e3d9adf16c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.708271 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.708188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e848e523-e953-4cef-b352-34e3d9adf16c-data-volume\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.708271 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.708222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e848e523-e953-4cef-b352-34e3d9adf16c-crio-socket\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " 
pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.809454 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.809419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e848e523-e953-4cef-b352-34e3d9adf16c-data-volume\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.809454 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.809463 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e848e523-e953-4cef-b352-34e3d9adf16c-crio-socket\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.809857 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.809486 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e848e523-e953-4cef-b352-34e3d9adf16c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.809857 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.809540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fnkx\" (UniqueName: \"kubernetes.io/projected/e848e523-e953-4cef-b352-34e3d9adf16c-kube-api-access-9fnkx\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.809857 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.809560 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e848e523-e953-4cef-b352-34e3d9adf16c-crio-socket\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.809857 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.809594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e848e523-e953-4cef-b352-34e3d9adf16c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.809857 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.809744 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e848e523-e953-4cef-b352-34e3d9adf16c-data-volume\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.810596 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.810575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e848e523-e953-4cef-b352-34e3d9adf16c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.811690 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.811673 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e848e523-e953-4cef-b352-34e3d9adf16c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.818881 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.818857 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fnkx\" (UniqueName: \"kubernetes.io/projected/e848e523-e953-4cef-b352-34e3d9adf16c-kube-api-access-9fnkx\") pod \"insights-runtime-extractor-rsxtz\" (UID: \"e848e523-e953-4cef-b352-34e3d9adf16c\") " pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:41.964292 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:41.964205 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rsxtz" Apr 20 20:07:42.277911 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:42.277883 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rsxtz"] Apr 20 20:07:42.281916 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:07:42.281884 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode848e523_e953_4cef_b352_34e3d9adf16c.slice/crio-632542a597cd0d65f59a56be9f78f1f2366224ad7b7dc7e91b444dd1144b2a21 WatchSource:0}: Error finding container 632542a597cd0d65f59a56be9f78f1f2366224ad7b7dc7e91b444dd1144b2a21: Status 404 returned error can't find the container with id 632542a597cd0d65f59a56be9f78f1f2366224ad7b7dc7e91b444dd1144b2a21 Apr 20 20:07:43.246989 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:43.246954 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j7twh" event={"ID":"77bd0885-5141-4657-8ae7-140bbc18a034","Type":"ContainerStarted","Data":"22976d634d1617219a029e922a494ebcb0940fd567625656c9871f0438da25e4"} Apr 20 20:07:43.246989 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:43.246990 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j7twh" event={"ID":"77bd0885-5141-4657-8ae7-140bbc18a034","Type":"ContainerStarted","Data":"ebf6a8ed891ec3fcd70aca2a4e7b12690fbafbd87bfd5d6480dfe5dc53f9e1ec"} Apr 20 20:07:43.247456 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:43.247055 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-j7twh" Apr 20 20:07:43.248440 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:43.248421 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rsxtz" event={"ID":"e848e523-e953-4cef-b352-34e3d9adf16c","Type":"ContainerStarted","Data":"d2003e3da42b6a253618e93f56ec48977671e5b25ae4db23d9ce34a56395f0c2"} Apr 20 20:07:43.248440 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:43.248443 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rsxtz" event={"ID":"e848e523-e953-4cef-b352-34e3d9adf16c","Type":"ContainerStarted","Data":"4bb722721c147293d89b870641f932e5592c03671f7e13269530f57452784fd9"} Apr 20 20:07:43.248564 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:43.248453 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rsxtz" 
event={"ID":"e848e523-e953-4cef-b352-34e3d9adf16c","Type":"ContainerStarted","Data":"632542a597cd0d65f59a56be9f78f1f2366224ad7b7dc7e91b444dd1144b2a21"} Apr 20 20:07:43.263295 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:43.263228 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-j7twh" podStartSLOduration=136.912105645 podStartE2EDuration="2m18.263218226s" podCreationTimestamp="2026-04-20 20:05:25 +0000 UTC" firstStartedPulling="2026-04-20 20:07:40.853318636 +0000 UTC m=+168.600238686" lastFinishedPulling="2026-04-20 20:07:42.204431214 +0000 UTC m=+169.951351267" observedRunningTime="2026-04-20 20:07:43.262312734 +0000 UTC m=+171.009232805" watchObservedRunningTime="2026-04-20 20:07:43.263218226 +0000 UTC m=+171.010138296" Apr 20 20:07:45.255683 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:45.255648 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rsxtz" event={"ID":"e848e523-e953-4cef-b352-34e3d9adf16c","Type":"ContainerStarted","Data":"38ee507180ebd1fd45aafdaff307c4c5d6f84d381d8de9c49cf88615cdcb42ef"} Apr 20 20:07:45.272883 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:45.272837 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rsxtz" podStartSLOduration=2.342444099 podStartE2EDuration="4.272825255s" podCreationTimestamp="2026-04-20 20:07:41 +0000 UTC" firstStartedPulling="2026-04-20 20:07:42.323318259 +0000 UTC m=+170.070238314" lastFinishedPulling="2026-04-20 20:07:44.253699421 +0000 UTC m=+172.000619470" observedRunningTime="2026-04-20 20:07:45.272579247 +0000 UTC m=+173.019499321" watchObservedRunningTime="2026-04-20 20:07:45.272825255 +0000 UTC m=+173.019745325" Apr 20 20:07:53.254831 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:53.254803 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-j7twh" Apr 20 20:07:53.713762 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:53.713695 2575 patch_prober.go:28] interesting pod/image-registry-7db79d6bf5-nq2dr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 20:07:53.713762 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:53.713751 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" podUID="39251b0b-f5d0-45a9-bc58-4e02fa4a556d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:07:55.220194 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:55.220162 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" Apr 20 20:07:56.984857 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:56.984827 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dc2vn"] Apr 20 20:07:56.990719 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:56.990689 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:56.993165 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:56.993142 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 20:07:56.993165 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:56.993156 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 20:07:56.993385 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:56.993207 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 20:07:56.993385 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:56.993306 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 20:07:56.993606 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:56.993593 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-h4n7h\"" Apr 20 20:07:56.994455 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:56.994426 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 20:07:56.994455 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:56.994447 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 20:07:57.008417 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.008393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-accelerators-collector-config\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.008500 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.008435 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-textfile\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.008500 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.008469 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-tls\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.008571 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.008511 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/164da49e-606e-49b0-9bd5-8181919bd837-metrics-client-ca\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.008571 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.008555 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-99dpw\" (UniqueName: \"kubernetes.io/projected/164da49e-606e-49b0-9bd5-8181919bd837-kube-api-access-99dpw\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.008631 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.008600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.008631 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.008622 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/164da49e-606e-49b0-9bd5-8181919bd837-sys\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.008694 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.008637 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/164da49e-606e-49b0-9bd5-8181919bd837-root\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.008694 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.008652 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-wtmp\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.109336 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.109309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-tls\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.109459 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.109355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/164da49e-606e-49b0-9bd5-8181919bd837-metrics-client-ca\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.109459 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.109374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99dpw\" (UniqueName: \"kubernetes.io/projected/164da49e-606e-49b0-9bd5-8181919bd837-kube-api-access-99dpw\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.109459 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.109402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " 
pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.109459 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.109422 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/164da49e-606e-49b0-9bd5-8181919bd837-sys\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.109459 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.109437 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/164da49e-606e-49b0-9bd5-8181919bd837-root\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.109459 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.109455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-wtmp\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.109744 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:07:57.109436 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 20:07:57.109744 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.109515 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-accelerators-collector-config\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.109744 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.109537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-textfile\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.109744 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.109540 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/164da49e-606e-49b0-9bd5-8181919bd837-root\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.109744 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:07:57.109556 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-tls podName:164da49e-606e-49b0-9bd5-8181919bd837 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:57.609533877 +0000 UTC m=+185.356453939 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-tls") pod "node-exporter-dc2vn" (UID: "164da49e-606e-49b0-9bd5-8181919bd837") : secret "node-exporter-tls" not found Apr 20 20:07:57.109744 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.109514 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/164da49e-606e-49b0-9bd5-8181919bd837-sys\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.109957 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.109768 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-wtmp\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.109957 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.109875 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-textfile\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.110024 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.109991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/164da49e-606e-49b0-9bd5-8181919bd837-metrics-client-ca\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.110066 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.110021 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-accelerators-collector-config\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.111724 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.111706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.117837 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.117815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99dpw\" (UniqueName: \"kubernetes.io/projected/164da49e-606e-49b0-9bd5-8181919bd837-kube-api-access-99dpw\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 20:07:57.613390 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.613364 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/164da49e-606e-49b0-9bd5-8181919bd837-node-exporter-tls\") pod \"node-exporter-dc2vn\" (UID: \"164da49e-606e-49b0-9bd5-8181919bd837\") " pod="openshift-monitoring/node-exporter-dc2vn" Apr 20 
Apr 20 20:07:57.900006 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:57.899948 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dc2vn"
Apr 20 20:07:57.907601 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:07:57.907581 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod164da49e_606e_49b0_9bd5_8181919bd837.slice/crio-015cff7a75a44391d99d73bdb0921f5c9138c796969f4e4e1bc6bccbec87fcec WatchSource:0}: Error finding container 015cff7a75a44391d99d73bdb0921f5c9138c796969f4e4e1bc6bccbec87fcec: Status 404 returned error can't find the container with id 015cff7a75a44391d99d73bdb0921f5c9138c796969f4e4e1bc6bccbec87fcec
Apr 20 20:07:58.285863 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:58.285829 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dc2vn" event={"ID":"164da49e-606e-49b0-9bd5-8181919bd837","Type":"ContainerStarted","Data":"015cff7a75a44391d99d73bdb0921f5c9138c796969f4e4e1bc6bccbec87fcec"}
Apr 20 20:07:59.289141 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:59.289104 2575 generic.go:358] "Generic (PLEG): container finished" podID="164da49e-606e-49b0-9bd5-8181919bd837" containerID="18dc599a137d8a1bf98b3f3284a1a87e0f74bc60fa8eb182debaa8630fc4f02c" exitCode=0
Apr 20 20:07:59.289483 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:07:59.289164 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dc2vn" event={"ID":"164da49e-606e-49b0-9bd5-8181919bd837","Type":"ContainerDied","Data":"18dc599a137d8a1bf98b3f3284a1a87e0f74bc60fa8eb182debaa8630fc4f02c"}
Apr 20 20:08:00.293857 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:00.293823 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dc2vn" event={"ID":"164da49e-606e-49b0-9bd5-8181919bd837","Type":"ContainerStarted","Data":"dd0f2bdf08e98cbc2afb8a33461538ee772d7ccfc1f1812112874b9cd69901b9"}
Apr 20 20:08:00.293857 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:00.293860 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dc2vn" event={"ID":"164da49e-606e-49b0-9bd5-8181919bd837","Type":"ContainerStarted","Data":"b35aab7cd988290c9f62f242a9a612a19aaac352b5c11dcad203a050a12cf13f"}
Apr 20 20:08:00.317946 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:00.317888 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dc2vn" podStartSLOduration=3.628807254 podStartE2EDuration="4.317864363s" podCreationTimestamp="2026-04-20 20:07:56 +0000 UTC" firstStartedPulling="2026-04-20 20:07:57.909140459 +0000 UTC m=+185.656060507" lastFinishedPulling="2026-04-20 20:07:58.598197564 +0000 UTC m=+186.345117616" observedRunningTime="2026-04-20 20:08:00.316608323 +0000 UTC m=+188.063528411" watchObservedRunningTime="2026-04-20 20:08:00.317864363 +0000 UTC m=+188.064784434"
Apr 20 20:08:03.629387 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:03.629357 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7db79d6bf5-nq2dr"]
Apr 20 20:08:26.039730 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:26.039693 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk" podUID="45b9b480-0f24-4767-bcf2-c039e9306050" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 20:08:28.651442 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.651397 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" podUID="39251b0b-f5d0-45a9-bc58-4e02fa4a556d" containerName="registry" containerID="cri-o://fb104ea50aa79641864f81fd62207f3cb89beb16301132bad802ac761e0b49e2" gracePeriod=30
Apr 20 20:08:28.888812 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.888791 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:08:28.923755 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.923694 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-trusted-ca\") pod \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") "
Apr 20 20:08:28.923755 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.923735 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rmt6\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-kube-api-access-6rmt6\") pod \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") "
Apr 20 20:08:28.923923 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.923781 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-certificates\") pod \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") "
Apr 20 20:08:28.924186 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.924060 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-ca-trust-extracted\") pod \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") "
Apr 20 20:08:28.924186 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.924123 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls\") pod \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") "
Apr 20 20:08:28.924346 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.924227 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-installation-pull-secrets\") pod \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") "
Apr 20 20:08:28.924346 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.924270 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "39251b0b-f5d0-45a9-bc58-4e02fa4a556d" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:08:28.924346 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.924286 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-bound-sa-token\") pod \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") "
Apr 20 20:08:28.924346 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.924290 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "39251b0b-f5d0-45a9-bc58-4e02fa4a556d" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:08:28.924346 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.924317 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-image-registry-private-configuration\") pod \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\" (UID: \"39251b0b-f5d0-45a9-bc58-4e02fa4a556d\") "
Apr 20 20:08:28.924674 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.924656 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-certificates\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:08:28.924743 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.924680 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-trusted-ca\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:08:28.926790 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.926758 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "39251b0b-f5d0-45a9-bc58-4e02fa4a556d" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:08:28.926928 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.926906 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "39251b0b-f5d0-45a9-bc58-4e02fa4a556d" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:08:28.927075 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.927047 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "39251b0b-f5d0-45a9-bc58-4e02fa4a556d" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:08:28.927354 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.927332 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "39251b0b-f5d0-45a9-bc58-4e02fa4a556d" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:08:28.927506 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.927480 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-kube-api-access-6rmt6" (OuterVolumeSpecName: "kube-api-access-6rmt6") pod "39251b0b-f5d0-45a9-bc58-4e02fa4a556d" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d"). InnerVolumeSpecName "kube-api-access-6rmt6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:08:28.935032 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:28.935011 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "39251b0b-f5d0-45a9-bc58-4e02fa4a556d" (UID: "39251b0b-f5d0-45a9-bc58-4e02fa4a556d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:08:29.025888 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.025858 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-installation-pull-secrets\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:08:29.025888 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.025886 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-bound-sa-token\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:08:29.026005 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.025896 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-image-registry-private-configuration\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:08:29.026005 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.025936 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6rmt6\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-kube-api-access-6rmt6\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:08:29.026005 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.025945 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-ca-trust-extracted\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:08:29.026005 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.025954 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39251b0b-f5d0-45a9-bc58-4e02fa4a556d-registry-tls\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:08:29.370873 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.370844 2575 generic.go:358] "Generic (PLEG): container finished" podID="39251b0b-f5d0-45a9-bc58-4e02fa4a556d" containerID="fb104ea50aa79641864f81fd62207f3cb89beb16301132bad802ac761e0b49e2" exitCode=0
Apr 20 20:08:29.371006 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.370887 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" event={"ID":"39251b0b-f5d0-45a9-bc58-4e02fa4a556d","Type":"ContainerDied","Data":"fb104ea50aa79641864f81fd62207f3cb89beb16301132bad802ac761e0b49e2"}
Apr 20 20:08:29.371006 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.370906 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr"
Apr 20 20:08:29.371006 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.370932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7db79d6bf5-nq2dr" event={"ID":"39251b0b-f5d0-45a9-bc58-4e02fa4a556d","Type":"ContainerDied","Data":"49d818a1d877c3913e3db4f6ef8e8bc87d6ca6d7321c30c8b81ca775d85faa8e"}
Apr 20 20:08:29.371006 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.370951 2575 scope.go:117] "RemoveContainer" containerID="fb104ea50aa79641864f81fd62207f3cb89beb16301132bad802ac761e0b49e2"
Apr 20 20:08:29.378785 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.378756 2575 scope.go:117] "RemoveContainer" containerID="fb104ea50aa79641864f81fd62207f3cb89beb16301132bad802ac761e0b49e2"
Apr 20 20:08:29.379035 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:08:29.379013 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb104ea50aa79641864f81fd62207f3cb89beb16301132bad802ac761e0b49e2\": container with ID starting with fb104ea50aa79641864f81fd62207f3cb89beb16301132bad802ac761e0b49e2 not found: ID does not exist" containerID="fb104ea50aa79641864f81fd62207f3cb89beb16301132bad802ac761e0b49e2"
Apr 20 20:08:29.379091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.379044 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb104ea50aa79641864f81fd62207f3cb89beb16301132bad802ac761e0b49e2"} err="failed to get container status \"fb104ea50aa79641864f81fd62207f3cb89beb16301132bad802ac761e0b49e2\": rpc error: code = NotFound desc = could not find container \"fb104ea50aa79641864f81fd62207f3cb89beb16301132bad802ac761e0b49e2\": container with ID starting with fb104ea50aa79641864f81fd62207f3cb89beb16301132bad802ac761e0b49e2 not found: ID does not exist"
Apr 20 20:08:29.394598 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.394574 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7db79d6bf5-nq2dr"]
Apr 20 20:08:29.404313 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:29.404294 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7db79d6bf5-nq2dr"]
Apr 20 20:08:30.731393 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:30.731348 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39251b0b-f5d0-45a9-bc58-4e02fa4a556d" path="/var/lib/kubelet/pods/39251b0b-f5d0-45a9-bc58-4e02fa4a556d/volumes"
probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 20:08:46.040363 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:46.040319 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk" podUID="45b9b480-0f24-4767-bcf2-c039e9306050" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 20:08:46.040752 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:46.040403 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk" Apr 20 20:08:46.040928 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:46.040906 2575 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"73628eb8e932d35e10d3400494f0aec5c5bb0d3092f073959ff6605a7b4b8990"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 20 20:08:46.040966 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:46.040951 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk" podUID="45b9b480-0f24-4767-bcf2-c039e9306050" containerName="service-proxy" containerID="cri-o://73628eb8e932d35e10d3400494f0aec5c5bb0d3092f073959ff6605a7b4b8990" gracePeriod=30 Apr 20 20:08:46.418344 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:46.418268 2575 generic.go:358] "Generic (PLEG): container finished" podID="45b9b480-0f24-4767-bcf2-c039e9306050" containerID="73628eb8e932d35e10d3400494f0aec5c5bb0d3092f073959ff6605a7b4b8990" exitCode=2 Apr 20 20:08:46.418344 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:46.418282 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk" event={"ID":"45b9b480-0f24-4767-bcf2-c039e9306050","Type":"ContainerDied","Data":"73628eb8e932d35e10d3400494f0aec5c5bb0d3092f073959ff6605a7b4b8990"} Apr 20 20:08:46.418344 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:08:46.418318 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67886544cd-bjjtk" event={"ID":"45b9b480-0f24-4767-bcf2-c039e9306050","Type":"ContainerStarted","Data":"0365cda8b64a43d3abddc32159c0ce6dda3fd9e6e22595049cd16b5b5e112994"} Apr 20 20:09:04.562543 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:09:04.562506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:09:04.564792 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:09:04.564771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de77e01d-c1e5-4a7e-99df-1261a9d21bed-metrics-certs\") pod \"network-metrics-daemon-gf2gx\" (UID: \"de77e01d-c1e5-4a7e-99df-1261a9d21bed\") " pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:09:04.630693 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:09:04.630668 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jt5w7\"" Apr 20 20:09:04.638767 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:09:04.638749 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gf2gx" Apr 20 20:09:04.747975 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:09:04.747947 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gf2gx"] Apr 20 20:09:04.750804 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:09:04.750776 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde77e01d_c1e5_4a7e_99df_1261a9d21bed.slice/crio-f1262f4c81e25ac5a0852abe682574c4b7c71defe1145e197574da2e51515551 WatchSource:0}: Error finding container f1262f4c81e25ac5a0852abe682574c4b7c71defe1145e197574da2e51515551: Status 404 returned error can't find the container with id f1262f4c81e25ac5a0852abe682574c4b7c71defe1145e197574da2e51515551 Apr 20 20:09:05.469895 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:09:05.469814 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gf2gx" event={"ID":"de77e01d-c1e5-4a7e-99df-1261a9d21bed","Type":"ContainerStarted","Data":"f1262f4c81e25ac5a0852abe682574c4b7c71defe1145e197574da2e51515551"} Apr 20 20:09:06.474440 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:09:06.474406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gf2gx" event={"ID":"de77e01d-c1e5-4a7e-99df-1261a9d21bed","Type":"ContainerStarted","Data":"9b9630e59674a91463f4644b01efa2e5e2ccff03caded31f23144aeb15a8e553"} Apr 20 20:09:06.474828 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:09:06.474447 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gf2gx" event={"ID":"de77e01d-c1e5-4a7e-99df-1261a9d21bed","Type":"ContainerStarted","Data":"303f89a6e3eb59728c34c674a108a5ce9f8f79c6322b9f825967c36dd21cc5fb"} Apr 20 20:09:06.492003 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:09:06.491955 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gf2gx" podStartSLOduration=252.569691394 podStartE2EDuration="4m13.491942501s" podCreationTimestamp="2026-04-20 20:04:53 +0000 UTC" firstStartedPulling="2026-04-20 20:09:04.752632621 +0000 UTC m=+252.499552670" lastFinishedPulling="2026-04-20 20:09:05.674883728 +0000 UTC m=+253.421803777" observedRunningTime="2026-04-20 20:09:06.490418885 +0000 UTC m=+254.237338974" watchObservedRunningTime="2026-04-20 20:09:06.491942501 +0000 UTC m=+254.238862572" Apr 20 20:09:52.642469 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:09:52.642445 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:09:52.642469 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:09:52.642455 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:09:52.648896 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:09:52.648871 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 20:10:34.862160 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.862125 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"] Apr 20 
Apr 20 20:10:34.862628 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.862366 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39251b0b-f5d0-45a9-bc58-4e02fa4a556d" containerName="registry"
Apr 20 20:10:34.862628 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.862378 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="39251b0b-f5d0-45a9-bc58-4e02fa4a556d" containerName="registry"
Apr 20 20:10:34.862628 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.862436 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="39251b0b-f5d0-45a9-bc58-4e02fa4a556d" containerName="registry"
Apr 20 20:10:34.865065 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.865046 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"
Apr 20 20:10:34.867450 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.867422 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 20 20:10:34.867584 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.867450 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-zrwkg\""
Apr 20 20:10:34.868484 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.868462 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 20 20:10:34.868484 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.868474 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 20 20:10:34.868484 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.868484 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 20 20:10:34.868663 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.868531 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 20 20:10:34.874954 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.874931 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"]
Apr 20 20:10:34.925960 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.925935 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jqzns\" (UID: \"6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"
Apr 20 20:10:34.926053 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.925966 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jnjf\" (UniqueName: \"kubernetes.io/projected/6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43-kube-api-access-9jnjf\") pod \"keda-metrics-apiserver-7c9f485588-jqzns\" (UID: \"6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"
Apr 20 20:10:34.926053 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:34.925989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-jqzns\" (UID: \"6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"
Apr 20 20:10:35.026989 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:35.026969 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jqzns\" (UID: \"6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"
Apr 20 20:10:35.027114 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:35.026999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jnjf\" (UniqueName: \"kubernetes.io/projected/6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43-kube-api-access-9jnjf\") pod \"keda-metrics-apiserver-7c9f485588-jqzns\" (UID: \"6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"
Apr 20 20:10:35.027114 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:35.027021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-jqzns\" (UID: \"6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"
Apr 20 20:10:35.027227 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:10:35.027122 2575 secret.go:281] references non-existent secret key: tls.crt
Apr 20 20:10:35.027227 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:10:35.027143 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 20 20:10:35.027227 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:10:35.027167 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns: references non-existent secret key: tls.crt
Apr 20 20:10:35.027397 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:10:35.027229 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43-certificates podName:6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43 nodeName:}" failed. No retries permitted until 2026-04-20 20:10:35.527209732 +0000 UTC m=+343.274129798 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43-certificates") pod "keda-metrics-apiserver-7c9f485588-jqzns" (UID: "6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43") : references non-existent secret key: tls.crt
Apr 20 20:10:35.027397 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:35.027314 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-jqzns\" (UID: \"6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"
Apr 20 20:10:35.035077 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:35.035055 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jnjf\" (UniqueName: \"kubernetes.io/projected/6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43-kube-api-access-9jnjf\") pod \"keda-metrics-apiserver-7c9f485588-jqzns\" (UID: \"6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"
Apr 20 20:10:35.531280 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:35.531237 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jqzns\" (UID: \"6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"
Apr 20 20:10:35.533562 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:35.533534 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jqzns\" (UID: \"6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"
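This keda-metrics-apiserver failure differs from the earlier missing-secret case: kedaorg-certs exists (its cache was populated just above), but the projected volume references a key, tls.crt, that the secret does not contain yet, so SetUp fails until the key is written and the 500ms retry succeeds. A sketch of that key-level check against a secret's data map; plain maps stand in for the real Secret object and projection types:

```go
package main

import "fmt"

// projectSecretKeys mirrors the check a projected secret source performs:
// every referenced key must exist in the secret's data, or the whole
// volume setup fails with "references non-existent secret key".
func projectSecretKeys(data map[string][]byte, keys []string) (map[string][]byte, error) {
	out := make(map[string][]byte, len(keys))
	for _, k := range keys {
		v, ok := data[k]
		if !ok {
			return nil, fmt.Errorf("references non-existent secret key: %s", k)
		}
		out[k] = v
	}
	return out, nil
}

func main() {
	secret := map[string][]byte{"ca.crt": []byte("...")} // tls.crt not written yet

	if _, err := projectSecretKeys(secret, []string{"tls.crt", "tls.key"}); err != nil {
		fmt.Println("SetUp failed:", err) // matches the error text in the log
	}

	// Whatever controller owns the cert eventually fills the keys in,
	// and the kubelet's next retry succeeds, as at 20:10:35.533 above.
	secret["tls.crt"] = []byte("...")
	secret["tls.key"] = []byte("...")
	if payload, err := projectSecretKeys(secret, []string{"tls.crt", "tls.key"}); err == nil {
		fmt.Println("retry succeeded with", len(payload), "keys")
	}
}
```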
Apr 20 20:10:35.775029 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:35.775005 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"
Apr 20 20:10:35.886831 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:35.886803 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"]
Apr 20 20:10:35.889633 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:10:35.889606 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d1a21a9_dcf7_4af4_b0e3_d7efd0978e43.slice/crio-cd7aed315a3e27589c41de67db99aab0918fa6ff76f044122f4c99f2accf3cba WatchSource:0}: Error finding container cd7aed315a3e27589c41de67db99aab0918fa6ff76f044122f4c99f2accf3cba: Status 404 returned error can't find the container with id cd7aed315a3e27589c41de67db99aab0918fa6ff76f044122f4c99f2accf3cba
Apr 20 20:10:35.890967 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:35.890947 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:10:36.730915 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:36.730885 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns" event={"ID":"6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43","Type":"ContainerStarted","Data":"cd7aed315a3e27589c41de67db99aab0918fa6ff76f044122f4c99f2accf3cba"}
Apr 20 20:10:40.744118 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:40.744083 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns" event={"ID":"6d1a21a9-dcf7-4af4-b0e3-d7efd0978e43","Type":"ContainerStarted","Data":"5e6e2fa555459c08600cbe058c501c018cd9bfb4048fe3f89ae071d620531b28"}
Apr 20 20:10:40.744484 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:40.744208 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"
Apr 20 20:10:40.762144 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:40.762102 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns" podStartSLOduration=2.637202387 podStartE2EDuration="6.76209252s" podCreationTimestamp="2026-04-20 20:10:34 +0000 UTC" firstStartedPulling="2026-04-20 20:10:35.891129151 +0000 UTC m=+343.638049203" lastFinishedPulling="2026-04-20 20:10:40.016019282 +0000 UTC m=+347.762939336" observedRunningTime="2026-04-20 20:10:40.761037634 +0000 UTC m=+348.507957719" watchObservedRunningTime="2026-04-20 20:10:40.76209252 +0000 UTC m=+348.509012591"
Apr 20 20:10:51.751146 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:10:51.751109 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jqzns"
Apr 20 20:11:42.368050 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.368018 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-d4z2x"]
Apr 20 20:11:42.370999 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.370984 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-d4z2x"
Apr 20 20:11:42.373988 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.373954 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 20 20:11:42.373988 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.373954 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 20 20:11:42.374237 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.374208 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-8lht9\""
Apr 20 20:11:42.375207 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.375185 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 20 20:11:42.383489 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.383469 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-d4z2x"]
Apr 20 20:11:42.446745 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.446721 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxj7x\" (UniqueName: \"kubernetes.io/projected/10c64b38-1d2a-4779-9775-beb30405b81f-kube-api-access-bxj7x\") pod \"seaweedfs-86cc847c5c-d4z2x\" (UID: \"10c64b38-1d2a-4779-9775-beb30405b81f\") " pod="kserve/seaweedfs-86cc847c5c-d4z2x"
Apr 20 20:11:42.446843 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.446762 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/10c64b38-1d2a-4779-9775-beb30405b81f-data\") pod \"seaweedfs-86cc847c5c-d4z2x\" (UID: \"10c64b38-1d2a-4779-9775-beb30405b81f\") " pod="kserve/seaweedfs-86cc847c5c-d4z2x"
Apr 20 20:11:42.547203 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.547179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxj7x\" (UniqueName: \"kubernetes.io/projected/10c64b38-1d2a-4779-9775-beb30405b81f-kube-api-access-bxj7x\") pod \"seaweedfs-86cc847c5c-d4z2x\" (UID: \"10c64b38-1d2a-4779-9775-beb30405b81f\") " pod="kserve/seaweedfs-86cc847c5c-d4z2x"
Apr 20 20:11:42.547308 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.547217 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/10c64b38-1d2a-4779-9775-beb30405b81f-data\") pod \"seaweedfs-86cc847c5c-d4z2x\" (UID: \"10c64b38-1d2a-4779-9775-beb30405b81f\") " pod="kserve/seaweedfs-86cc847c5c-d4z2x"
Apr 20 20:11:42.547522 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.547507 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/10c64b38-1d2a-4779-9775-beb30405b81f-data\") pod \"seaweedfs-86cc847c5c-d4z2x\" (UID: \"10c64b38-1d2a-4779-9775-beb30405b81f\") " pod="kserve/seaweedfs-86cc847c5c-d4z2x"
Apr 20 20:11:42.554962 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.554943 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxj7x\" (UniqueName: \"kubernetes.io/projected/10c64b38-1d2a-4779-9775-beb30405b81f-kube-api-access-bxj7x\") pod \"seaweedfs-86cc847c5c-d4z2x\" (UID: \"10c64b38-1d2a-4779-9775-beb30405b81f\") " pod="kserve/seaweedfs-86cc847c5c-d4z2x"
Apr 20 20:11:42.680450 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.680379 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-d4z2x"
Apr 20 20:11:42.799651 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.799630 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-d4z2x"]
Apr 20 20:11:42.802010 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:11:42.801982 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c64b38_1d2a_4779_9775_beb30405b81f.slice/crio-cfe792d3e905a0dc893b24e8da7748b3ff38aa9a33f847318bf4df6e2e327778 WatchSource:0}: Error finding container cfe792d3e905a0dc893b24e8da7748b3ff38aa9a33f847318bf4df6e2e327778: Status 404 returned error can't find the container with id cfe792d3e905a0dc893b24e8da7748b3ff38aa9a33f847318bf4df6e2e327778
Apr 20 20:11:42.909647 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:42.909619 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-d4z2x" event={"ID":"10c64b38-1d2a-4779-9775-beb30405b81f","Type":"ContainerStarted","Data":"cfe792d3e905a0dc893b24e8da7748b3ff38aa9a33f847318bf4df6e2e327778"}
Apr 20 20:11:46.923584 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:46.923547 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-d4z2x" event={"ID":"10c64b38-1d2a-4779-9775-beb30405b81f","Type":"ContainerStarted","Data":"4ae7583fa0cd09151c6f43437db94d6bde72e1910876bf8a354f1277a39691c2"}
Apr 20 20:11:46.924031 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:46.923758 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-d4z2x"
Apr 20 20:11:46.939365 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:46.939319 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-d4z2x" podStartSLOduration=1.7376027920000001 podStartE2EDuration="4.939308537s" podCreationTimestamp="2026-04-20 20:11:42 +0000 UTC" firstStartedPulling="2026-04-20 20:11:42.803162917 +0000 UTC m=+410.550082965" lastFinishedPulling="2026-04-20 20:11:46.004868659 +0000 UTC m=+413.751788710" observedRunningTime="2026-04-20 20:11:46.938611719 +0000 UTC m=+414.685531789" watchObservedRunningTime="2026-04-20 20:11:46.939308537 +0000 UTC m=+414.686228611"
Apr 20 20:11:52.929317 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:11:52.929284 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-d4z2x"
Apr 20 20:12:53.457756 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.457727 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-hdwvb"]
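The "Observed pod startup duration" entries can be recomputed from their own fields: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally excludes the image-pull window (firstStartedPulling to lastFinishedPulling). A sketch that redoes the arithmetic for the seaweedfs pod above; the layout string matches the timestamp format these entries use:

```go
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST" // the log's timestamp format

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values copied from the seaweedfs-86cc847c5c-d4z2x entry above.
	created := mustParse("2026-04-20 20:11:42 +0000 UTC")
	firstPull := mustParse("2026-04-20 20:11:42.803162917 +0000 UTC")
	lastPull := mustParse("2026-04-20 20:11:46.004868659 +0000 UTC")
	running := mustParse("2026-04-20 20:11:46.939308537 +0000 UTC") // watchObservedRunningTime

	e2e := running.Sub(created)          // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // pull window excluded

	fmt.Println("E2E:", e2e) // 4.939308537s, matching the entry
	// Prints 1.737602795s; the entry's 1.7376027920000001 differs in the last
	// nanoseconds because the tracker uses the monotonic m=+ offsets.
	fmt.Println("SLO:", slo)
}
```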
Apr 20 20:12:53.459657 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.459641 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-hdwvb"
Apr 20 20:12:53.462348 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.462327 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 20 20:12:53.462451 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.462369 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-9wgtx\""
Apr 20 20:12:53.470880 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.470859 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-hdwvb"]
Apr 20 20:12:53.473651 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.473629 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-2f9qj"]
Apr 20 20:12:53.475610 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.475590 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-2f9qj"
Apr 20 20:12:53.478104 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.478086 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-zh5rk\""
Apr 20 20:12:53.478187 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.478103 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 20 20:12:53.485229 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.485207 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-2f9qj"]
Apr 20 20:12:53.598209 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.598186 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7d88\" (UniqueName: \"kubernetes.io/projected/a85e939b-9f28-4c09-80c6-c476a342044c-kube-api-access-b7d88\") pod \"model-serving-api-86f7b4b499-hdwvb\" (UID: \"a85e939b-9f28-4c09-80c6-c476a342044c\") " pod="kserve/model-serving-api-86f7b4b499-hdwvb"
Apr 20 20:12:53.598318 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.598214 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttt6g\" (UniqueName: \"kubernetes.io/projected/5e9c78a9-f304-4c12-be6d-e096d460953e-kube-api-access-ttt6g\") pod \"odh-model-controller-696fc77849-2f9qj\" (UID: \"5e9c78a9-f304-4c12-be6d-e096d460953e\") " pod="kserve/odh-model-controller-696fc77849-2f9qj"
Apr 20 20:12:53.598318 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.598234 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a85e939b-9f28-4c09-80c6-c476a342044c-tls-certs\") pod \"model-serving-api-86f7b4b499-hdwvb\" (UID: \"a85e939b-9f28-4c09-80c6-c476a342044c\") " pod="kserve/model-serving-api-86f7b4b499-hdwvb"
Apr 20 20:12:53.598387 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.598364 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e9c78a9-f304-4c12-be6d-e096d460953e-cert\") pod \"odh-model-controller-696fc77849-2f9qj\" (UID: \"5e9c78a9-f304-4c12-be6d-e096d460953e\") " pod="kserve/odh-model-controller-696fc77849-2f9qj"
Apr 20 20:12:53.699576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.699549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e9c78a9-f304-4c12-be6d-e096d460953e-cert\") pod \"odh-model-controller-696fc77849-2f9qj\" (UID: \"5e9c78a9-f304-4c12-be6d-e096d460953e\") " pod="kserve/odh-model-controller-696fc77849-2f9qj"
Apr 20 20:12:53.699665 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.699586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7d88\" (UniqueName: \"kubernetes.io/projected/a85e939b-9f28-4c09-80c6-c476a342044c-kube-api-access-b7d88\") pod \"model-serving-api-86f7b4b499-hdwvb\" (UID: \"a85e939b-9f28-4c09-80c6-c476a342044c\") " pod="kserve/model-serving-api-86f7b4b499-hdwvb"
Apr 20 20:12:53.699665 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.699604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttt6g\" (UniqueName: \"kubernetes.io/projected/5e9c78a9-f304-4c12-be6d-e096d460953e-kube-api-access-ttt6g\") pod \"odh-model-controller-696fc77849-2f9qj\" (UID: \"5e9c78a9-f304-4c12-be6d-e096d460953e\") " pod="kserve/odh-model-controller-696fc77849-2f9qj"
Apr 20 20:12:53.699741 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:12:53.699692 2575 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 20 20:12:53.699741 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.699706 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a85e939b-9f28-4c09-80c6-c476a342044c-tls-certs\") pod \"model-serving-api-86f7b4b499-hdwvb\" (UID: \"a85e939b-9f28-4c09-80c6-c476a342044c\") " pod="kserve/model-serving-api-86f7b4b499-hdwvb"
Apr 20 20:12:53.699820 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:12:53.699762 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e9c78a9-f304-4c12-be6d-e096d460953e-cert podName:5e9c78a9-f304-4c12-be6d-e096d460953e nodeName:}" failed. No retries permitted until 2026-04-20 20:12:54.199744051 +0000 UTC m=+481.946664105 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5e9c78a9-f304-4c12-be6d-e096d460953e-cert") pod "odh-model-controller-696fc77849-2f9qj" (UID: "5e9c78a9-f304-4c12-be6d-e096d460953e") : secret "odh-model-controller-webhook-cert" not found
Apr 20 20:12:53.702163 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.702136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a85e939b-9f28-4c09-80c6-c476a342044c-tls-certs\") pod \"model-serving-api-86f7b4b499-hdwvb\" (UID: \"a85e939b-9f28-4c09-80c6-c476a342044c\") " pod="kserve/model-serving-api-86f7b4b499-hdwvb"
Apr 20 20:12:53.708733 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.708677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7d88\" (UniqueName: \"kubernetes.io/projected/a85e939b-9f28-4c09-80c6-c476a342044c-kube-api-access-b7d88\") pod \"model-serving-api-86f7b4b499-hdwvb\" (UID: \"a85e939b-9f28-4c09-80c6-c476a342044c\") " pod="kserve/model-serving-api-86f7b4b499-hdwvb"
Apr 20 20:12:53.708817 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.708798 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttt6g\" (UniqueName: \"kubernetes.io/projected/5e9c78a9-f304-4c12-be6d-e096d460953e-kube-api-access-ttt6g\") pod \"odh-model-controller-696fc77849-2f9qj\" (UID: \"5e9c78a9-f304-4c12-be6d-e096d460953e\") " pod="kserve/odh-model-controller-696fc77849-2f9qj"
Apr 20 20:12:53.769677 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.769659 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-hdwvb"
Apr 20 20:12:53.884290 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:53.884262 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-hdwvb"]
Apr 20 20:12:53.887134 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:12:53.887106 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda85e939b_9f28_4c09_80c6_c476a342044c.slice/crio-44d0b24e78dd2363c4f3b636c4d5fb6bfef081c0ae27f6e38470bf6f48d6dd2a WatchSource:0}: Error finding container 44d0b24e78dd2363c4f3b636c4d5fb6bfef081c0ae27f6e38470bf6f48d6dd2a: Status 404 returned error can't find the container with id 44d0b24e78dd2363c4f3b636c4d5fb6bfef081c0ae27f6e38470bf6f48d6dd2a
Apr 20 20:12:54.107980 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:54.107943 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-hdwvb" event={"ID":"a85e939b-9f28-4c09-80c6-c476a342044c","Type":"ContainerStarted","Data":"44d0b24e78dd2363c4f3b636c4d5fb6bfef081c0ae27f6e38470bf6f48d6dd2a"}
Apr 20 20:12:54.204079 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:54.204050 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e9c78a9-f304-4c12-be6d-e096d460953e-cert\") pod \"odh-model-controller-696fc77849-2f9qj\" (UID: \"5e9c78a9-f304-4c12-be6d-e096d460953e\") " pod="kserve/odh-model-controller-696fc77849-2f9qj"
pod="kserve/odh-model-controller-696fc77849-2f9qj" Apr 20 20:12:54.384755 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:54.384693 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-2f9qj" Apr 20 20:12:54.498765 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:54.498736 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-2f9qj"] Apr 20 20:12:54.501811 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:12:54.501786 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e9c78a9_f304_4c12_be6d_e096d460953e.slice/crio-5d8bce55db0b9aeb1fbef4bc644ceea9bce5ba110d7b8bd17bc657e5ee833c2b WatchSource:0}: Error finding container 5d8bce55db0b9aeb1fbef4bc644ceea9bce5ba110d7b8bd17bc657e5ee833c2b: Status 404 returned error can't find the container with id 5d8bce55db0b9aeb1fbef4bc644ceea9bce5ba110d7b8bd17bc657e5ee833c2b Apr 20 20:12:55.113395 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:55.113358 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-2f9qj" event={"ID":"5e9c78a9-f304-4c12-be6d-e096d460953e","Type":"ContainerStarted","Data":"5d8bce55db0b9aeb1fbef4bc644ceea9bce5ba110d7b8bd17bc657e5ee833c2b"} Apr 20 20:12:58.124080 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:58.124033 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-hdwvb" event={"ID":"a85e939b-9f28-4c09-80c6-c476a342044c","Type":"ContainerStarted","Data":"b0f801250934ef7be1bf801e8f39718f103f90788fffa79e6b323ef7b8120f17"} Apr 20 20:12:58.124512 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:58.124140 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-hdwvb" Apr 20 20:12:58.125466 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:58.125441 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-2f9qj" event={"ID":"5e9c78a9-f304-4c12-be6d-e096d460953e","Type":"ContainerStarted","Data":"189e430a8dcd68518e50becc273dcffa67bbb8fdbe7988dabaa7fc47759432ee"} Apr 20 20:12:58.125564 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:58.125547 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-2f9qj" Apr 20 20:12:58.146057 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:58.146013 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-hdwvb" podStartSLOduration=1.439289051 podStartE2EDuration="5.146000512s" podCreationTimestamp="2026-04-20 20:12:53 +0000 UTC" firstStartedPulling="2026-04-20 20:12:53.888831666 +0000 UTC m=+481.635751715" lastFinishedPulling="2026-04-20 20:12:57.595543114 +0000 UTC m=+485.342463176" observedRunningTime="2026-04-20 20:12:58.144671428 +0000 UTC m=+485.891591514" watchObservedRunningTime="2026-04-20 20:12:58.146000512 +0000 UTC m=+485.892920582" Apr 20 20:12:58.165320 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:12:58.165283 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-2f9qj" podStartSLOduration=2.061090621 podStartE2EDuration="5.165272172s" podCreationTimestamp="2026-04-20 20:12:53 +0000 UTC" firstStartedPulling="2026-04-20 20:12:54.503095606 +0000 UTC m=+482.250015656" lastFinishedPulling="2026-04-20 20:12:57.607277142 +0000 UTC 
m=+485.354197207" observedRunningTime="2026-04-20 20:12:58.164296151 +0000 UTC m=+485.911216222" watchObservedRunningTime="2026-04-20 20:12:58.165272172 +0000 UTC m=+485.912192240" Apr 20 20:13:09.130926 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:09.130899 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-2f9qj" Apr 20 20:13:09.132990 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:09.132970 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-hdwvb" Apr 20 20:13:30.067163 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.067086 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29"] Apr 20 20:13:30.100599 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.100572 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29"] Apr 20 20:13:30.100730 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.100692 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:30.103303 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.103279 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 20 20:13:30.103433 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.103288 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-predictor-serving-cert\"" Apr 20 20:13:30.103506 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.103493 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\"" Apr 20 20:13:30.103654 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.103636 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lqk4h\"" Apr 20 20:13:30.103753 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.103688 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 20 20:13:30.150057 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.150036 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:30.150172 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.150098 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72cs\" (UniqueName: \"kubernetes.io/projected/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-kube-api-access-f72cs\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:30.150218 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.150173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:30.150218 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.150201 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:30.154749 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.154730 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk"] Apr 20 20:13:30.175187 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.175163 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk"] Apr 20 20:13:30.175297 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.175283 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:30.177622 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.177603 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-127f1-predictor-serving-cert\"" Apr 20 20:13:30.177811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.177795 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-127f1-kube-rbac-proxy-sar-config\"" Apr 20 20:13:30.250969 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.250942 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:30.251094 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.250976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:30.251094 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.250999 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-127f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-success-200-isvc-127f1-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-127f1-predictor-68b975dccb-dcphk\" (UID: \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\") " pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:30.251094 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.251082 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nchwg\" (UniqueName: \"kubernetes.io/projected/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-kube-api-access-nchwg\") pod \"success-200-isvc-127f1-predictor-68b975dccb-dcphk\" (UID: \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\") " pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:30.251264 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.251127 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:30.251264 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.251167 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-proxy-tls\") pod \"success-200-isvc-127f1-predictor-68b975dccb-dcphk\" (UID: \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\") " pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:30.251264 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.251201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f72cs\" (UniqueName: \"kubernetes.io/projected/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-kube-api-access-f72cs\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:30.251474 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.251379 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:30.251821 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.251800 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:30.253815 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.253795 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:30.260116 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.260092 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72cs\" (UniqueName: \"kubernetes.io/projected/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-kube-api-access-f72cs\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29\" (UID: 
\"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:30.351687 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.351624 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-proxy-tls\") pod \"success-200-isvc-127f1-predictor-68b975dccb-dcphk\" (UID: \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\") " pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:30.351793 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.351690 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-127f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-success-200-isvc-127f1-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-127f1-predictor-68b975dccb-dcphk\" (UID: \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\") " pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:30.351793 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.351722 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nchwg\" (UniqueName: \"kubernetes.io/projected/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-kube-api-access-nchwg\") pod \"success-200-isvc-127f1-predictor-68b975dccb-dcphk\" (UID: \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\") " pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:30.351903 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:13:30.351785 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-127f1-predictor-serving-cert: secret "success-200-isvc-127f1-predictor-serving-cert" not found Apr 20 20:13:30.351903 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:13:30.351862 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-proxy-tls podName:e6cb64b5-1bab-48c6-997f-4dddb3884c8d nodeName:}" failed. No retries permitted until 2026-04-20 20:13:30.851841788 +0000 UTC m=+518.598761843 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-proxy-tls") pod "success-200-isvc-127f1-predictor-68b975dccb-dcphk" (UID: "e6cb64b5-1bab-48c6-997f-4dddb3884c8d") : secret "success-200-isvc-127f1-predictor-serving-cert" not found Apr 20 20:13:30.352367 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.352343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-127f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-success-200-isvc-127f1-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-127f1-predictor-68b975dccb-dcphk\" (UID: \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\") " pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:30.362957 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.362934 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nchwg\" (UniqueName: \"kubernetes.io/projected/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-kube-api-access-nchwg\") pod \"success-200-isvc-127f1-predictor-68b975dccb-dcphk\" (UID: \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\") " pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:30.411918 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.411897 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:30.531471 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.531447 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29"] Apr 20 20:13:30.533385 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:13:30.533357 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd251e1b5_ef06_472e_a8d0_6d5641b1a9e6.slice/crio-b1d7d65ea9abef444d16a900efb8439a0a19b26580de7d5924eea05966ab7c18 WatchSource:0}: Error finding container b1d7d65ea9abef444d16a900efb8439a0a19b26580de7d5924eea05966ab7c18: Status 404 returned error can't find the container with id b1d7d65ea9abef444d16a900efb8439a0a19b26580de7d5924eea05966ab7c18 Apr 20 20:13:30.856990 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.856957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-proxy-tls\") pod \"success-200-isvc-127f1-predictor-68b975dccb-dcphk\" (UID: \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\") " pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:30.859149 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:30.859123 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-proxy-tls\") pod \"success-200-isvc-127f1-predictor-68b975dccb-dcphk\" (UID: \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\") " pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:31.085269 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.085221 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:31.155946 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.155921 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz"] Apr 20 20:13:31.169022 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.168997 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:31.170130 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.170103 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz"] Apr 20 20:13:31.173585 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.173559 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\"" Apr 20 20:13:31.173696 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.173593 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-predictor-serving-cert\"" Apr 20 20:13:31.207990 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.207963 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk"] Apr 20 20:13:31.211138 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:13:31.211095 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6cb64b5_1bab_48c6_997f_4dddb3884c8d.slice/crio-71fe5cd4d5fc661f4cbfa1a0938e11653162050978151d865515beb0ca8f98a0 WatchSource:0}: Error finding container 71fe5cd4d5fc661f4cbfa1a0938e11653162050978151d865515beb0ca8f98a0: Status 404 returned error can't find the container with id 71fe5cd4d5fc661f4cbfa1a0938e11653162050978151d865515beb0ca8f98a0 Apr 20 20:13:31.225008 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.224968 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" event={"ID":"e6cb64b5-1bab-48c6-997f-4dddb3884c8d","Type":"ContainerStarted","Data":"71fe5cd4d5fc661f4cbfa1a0938e11653162050978151d865515beb0ca8f98a0"} Apr 20 20:13:31.226476 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.226446 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" event={"ID":"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6","Type":"ContainerStarted","Data":"b1d7d65ea9abef444d16a900efb8439a0a19b26580de7d5924eea05966ab7c18"} Apr 20 20:13:31.259415 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.259381 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dabf688b-196b-4f63-9154-7b6bb4fc446c-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:31.259512 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.259424 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dabf688b-196b-4f63-9154-7b6bb4fc446c-proxy-tls\") pod 
\"isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:31.259572 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.259544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dabf688b-196b-4f63-9154-7b6bb4fc446c-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:31.259818 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.259591 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzwdl\" (UniqueName: \"kubernetes.io/projected/dabf688b-196b-4f63-9154-7b6bb4fc446c-kube-api-access-fzwdl\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:31.360505 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.360071 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dabf688b-196b-4f63-9154-7b6bb4fc446c-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:31.360505 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.360126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzwdl\" (UniqueName: \"kubernetes.io/projected/dabf688b-196b-4f63-9154-7b6bb4fc446c-kube-api-access-fzwdl\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:31.360505 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.360179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dabf688b-196b-4f63-9154-7b6bb4fc446c-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:31.360505 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.360208 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dabf688b-196b-4f63-9154-7b6bb4fc446c-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:31.360839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.360814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dabf688b-196b-4f63-9154-7b6bb4fc446c-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:31.361706 
ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.361665 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dabf688b-196b-4f63-9154-7b6bb4fc446c-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:31.363839 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.363815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dabf688b-196b-4f63-9154-7b6bb4fc446c-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:31.369386 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.369361 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzwdl\" (UniqueName: \"kubernetes.io/projected/dabf688b-196b-4f63-9154-7b6bb4fc446c-kube-api-access-fzwdl\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:31.481898 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.481619 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:31.619639 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:31.619561 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz"] Apr 20 20:13:31.622393 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:13:31.622280 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddabf688b_196b_4f63_9154_7b6bb4fc446c.slice/crio-723243ae4740d7f76a03a5819e3104927723f316a016db7b35bbf2525608c7b7 WatchSource:0}: Error finding container 723243ae4740d7f76a03a5819e3104927723f316a016db7b35bbf2525608c7b7: Status 404 returned error can't find the container with id 723243ae4740d7f76a03a5819e3104927723f316a016db7b35bbf2525608c7b7 Apr 20 20:13:32.237564 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:32.237511 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" event={"ID":"dabf688b-196b-4f63-9154-7b6bb4fc446c","Type":"ContainerStarted","Data":"723243ae4740d7f76a03a5819e3104927723f316a016db7b35bbf2525608c7b7"} Apr 20 20:13:45.296638 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:45.296598 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" event={"ID":"e6cb64b5-1bab-48c6-997f-4dddb3884c8d","Type":"ContainerStarted","Data":"53e1c98e67e389c1ac10e8a5be544c661ae9aa6ea0f610bc935106a0a468d514"} Apr 20 20:13:45.298192 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:45.298158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" event={"ID":"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6","Type":"ContainerStarted","Data":"a8fdf39881cc7f5887364c01806f66ac5d8cf4042dd214002d0c7edb4aef907c"} Apr 20 20:13:45.299657 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:45.299635 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" event={"ID":"dabf688b-196b-4f63-9154-7b6bb4fc446c","Type":"ContainerStarted","Data":"c9d26f7a004b9a1d2afff97be10ed0e57b3de34284f693c4623068c1152573b9"} Apr 20 20:13:48.310243 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:48.310203 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" event={"ID":"e6cb64b5-1bab-48c6-997f-4dddb3884c8d","Type":"ContainerStarted","Data":"9e77cd4b2d91fedcc3ed8a50986177cd0a307d0a57fba3d33034b94afee84f2f"} Apr 20 20:13:48.310677 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:48.310336 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:48.310677 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:48.310421 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:48.311549 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:48.311521 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 20 20:13:48.328109 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:48.328067 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" podStartSLOduration=2.078701849 podStartE2EDuration="18.32805205s" podCreationTimestamp="2026-04-20 20:13:30 +0000 UTC" firstStartedPulling="2026-04-20 20:13:31.213338017 +0000 UTC m=+518.960258071" lastFinishedPulling="2026-04-20 20:13:47.462688224 +0000 UTC m=+535.209608272" observedRunningTime="2026-04-20 20:13:48.327409879 +0000 UTC m=+536.074329965" watchObservedRunningTime="2026-04-20 20:13:48.32805205 +0000 UTC m=+536.074972121" Apr 20 20:13:49.314904 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:49.314872 2575 generic.go:358] "Generic (PLEG): container finished" podID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerID="c9d26f7a004b9a1d2afff97be10ed0e57b3de34284f693c4623068c1152573b9" exitCode=0 Apr 20 20:13:49.315292 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:49.314946 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" event={"ID":"dabf688b-196b-4f63-9154-7b6bb4fc446c","Type":"ContainerDied","Data":"c9d26f7a004b9a1d2afff97be10ed0e57b3de34284f693c4623068c1152573b9"} Apr 20 20:13:49.316545 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:49.316502 2575 generic.go:358] "Generic (PLEG): container finished" podID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerID="a8fdf39881cc7f5887364c01806f66ac5d8cf4042dd214002d0c7edb4aef907c" exitCode=0 Apr 20 20:13:49.316613 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:49.316580 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" event={"ID":"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6","Type":"ContainerDied","Data":"a8fdf39881cc7f5887364c01806f66ac5d8cf4042dd214002d0c7edb4aef907c"} Apr 20 20:13:49.316835 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:49.316808 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 20 20:13:54.322665 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:54.322625 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:13:54.323210 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:54.323184 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 20 20:13:57.342222 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:57.342183 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" event={"ID":"dabf688b-196b-4f63-9154-7b6bb4fc446c","Type":"ContainerStarted","Data":"e36a3f4b23ee63c5e4164d5228f11d762b8d0691b61c123d15243bf7e743b163"} Apr 20 20:13:57.342707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:57.342235 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" event={"ID":"dabf688b-196b-4f63-9154-7b6bb4fc446c","Type":"ContainerStarted","Data":"0678547f26cf25c69daea0acccc0bad642055f867b48f187c0ab3fff5ee955a3"} Apr 20 20:13:57.342707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:57.342555 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:57.342707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:57.342696 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:13:57.344283 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:57.344236 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 20 20:13:57.344283 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:57.344275 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" event={"ID":"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6","Type":"ContainerStarted","Data":"b225a0726a41bfb66adc03e0c5a4ad592c4ed10c048c9d5998a9bb90a39beeb7"} Apr 20 20:13:57.344441 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:57.344302 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" event={"ID":"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6","Type":"ContainerStarted","Data":"f49270ad0de5afcd86f16d1761d5365076b5f8ebcb930c11a265c439377c91a3"} Apr 20 20:13:57.344628 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:57.344609 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:57.344689 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:57.344635 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:13:57.345638 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:57.345616 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 20 20:13:57.364171 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:57.364124 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" podStartSLOduration=1.181729701 podStartE2EDuration="26.364111379s" podCreationTimestamp="2026-04-20 20:13:31 +0000 UTC" firstStartedPulling="2026-04-20 20:13:31.625342557 +0000 UTC m=+519.372262605" lastFinishedPulling="2026-04-20 20:13:56.807724234 +0000 UTC m=+544.554644283" observedRunningTime="2026-04-20 20:13:57.362527138 +0000 UTC m=+545.109447208" watchObservedRunningTime="2026-04-20 20:13:57.364111379 +0000 UTC m=+545.111031450" Apr 20 20:13:57.385306 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:57.385238 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" podStartSLOduration=1.0998233530000001 podStartE2EDuration="27.385226487s" podCreationTimestamp="2026-04-20 20:13:30 +0000 UTC" firstStartedPulling="2026-04-20 20:13:30.535323146 +0000 UTC m=+518.282243195" lastFinishedPulling="2026-04-20 20:13:56.820726274 +0000 UTC m=+544.567646329" observedRunningTime="2026-04-20 20:13:57.384695756 +0000 UTC m=+545.131615827" watchObservedRunningTime="2026-04-20 20:13:57.385226487 +0000 UTC m=+545.132146559" Apr 20 20:13:58.347685 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:58.347645 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 20 20:13:58.348070 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:13:58.347645 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 20 20:14:03.351698 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:03.351669 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:14:03.352074 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:03.351734 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" Apr 20 20:14:03.352207 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:03.352181 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 20 20:14:03.352293 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:03.352177 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 20 20:14:04.324173 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:04.324128 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 20 20:14:13.352379 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:13.352337 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 20 20:14:13.352918 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:13.352337 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 20 20:14:14.323689 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:14.323652 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 20 20:14:23.352282 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:23.352203 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 20 20:14:23.352282 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:23.352207 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 20 20:14:24.323465 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:24.323418 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 20 20:14:33.352668 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:33.352625 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 20 20:14:33.353129 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:33.352625 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 20 20:14:34.324028 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:34.323996 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" Apr 20 20:14:43.352686 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:43.352649 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 20 20:14:43.353091 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:43.352646 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 20 20:14:52.661631 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:52.661604 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:14:52.664077 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:52.664055 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:14:53.352377 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:53.352332 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 20 20:14:53.352555 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:14:53.352348 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 20 20:15:00.299786 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.299753 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk"] Apr 20 20:15:00.300199 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.300036 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerName="kserve-container" containerID="cri-o://53e1c98e67e389c1ac10e8a5be544c661ae9aa6ea0f610bc935106a0a468d514" gracePeriod=30 Apr 20 20:15:00.300199 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.300071 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerName="kube-rbac-proxy" containerID="cri-o://9e77cd4b2d91fedcc3ed8a50986177cd0a307d0a57fba3d33034b94afee84f2f" gracePeriod=30 Apr 20 20:15:00.339534 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.339506 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n"] Apr 20 20:15:00.342971 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.342954 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:15:00.345922 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.345895 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-9290e-predictor-serving-cert\"" Apr 20 20:15:00.346391 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.346374 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-9290e-kube-rbac-proxy-sar-config\"" Apr 20 20:15:00.361972 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.361947 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n"] Apr 20 20:15:00.507266 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.507227 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-9290e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/79498564-cc7d-4361-9213-90d899860d1e-success-200-isvc-9290e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9290e-predictor-5b44755bcb-smm7n\" (UID: \"79498564-cc7d-4361-9213-90d899860d1e\") " pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:15:00.507398 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.507295 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79498564-cc7d-4361-9213-90d899860d1e-proxy-tls\") pod \"success-200-isvc-9290e-predictor-5b44755bcb-smm7n\" (UID: \"79498564-cc7d-4361-9213-90d899860d1e\") " pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:15:00.507398 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.507327 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4spg5\" (UniqueName: \"kubernetes.io/projected/79498564-cc7d-4361-9213-90d899860d1e-kube-api-access-4spg5\") pod \"success-200-isvc-9290e-predictor-5b44755bcb-smm7n\" (UID: \"79498564-cc7d-4361-9213-90d899860d1e\") " pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:15:00.530573 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.530543 2575 generic.go:358] "Generic (PLEG): container finished" podID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerID="9e77cd4b2d91fedcc3ed8a50986177cd0a307d0a57fba3d33034b94afee84f2f" exitCode=2 Apr 20 20:15:00.530701 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.530576 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" event={"ID":"e6cb64b5-1bab-48c6-997f-4dddb3884c8d","Type":"ContainerDied","Data":"9e77cd4b2d91fedcc3ed8a50986177cd0a307d0a57fba3d33034b94afee84f2f"} Apr 20 20:15:00.607764 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.607698 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-9290e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/79498564-cc7d-4361-9213-90d899860d1e-success-200-isvc-9290e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9290e-predictor-5b44755bcb-smm7n\" (UID: 
\"79498564-cc7d-4361-9213-90d899860d1e\") " pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:15:00.607764 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.607740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79498564-cc7d-4361-9213-90d899860d1e-proxy-tls\") pod \"success-200-isvc-9290e-predictor-5b44755bcb-smm7n\" (UID: \"79498564-cc7d-4361-9213-90d899860d1e\") " pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:15:00.607923 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:15:00.607831 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-9290e-predictor-serving-cert: secret "success-200-isvc-9290e-predictor-serving-cert" not found Apr 20 20:15:00.607923 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.607864 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4spg5\" (UniqueName: \"kubernetes.io/projected/79498564-cc7d-4361-9213-90d899860d1e-kube-api-access-4spg5\") pod \"success-200-isvc-9290e-predictor-5b44755bcb-smm7n\" (UID: \"79498564-cc7d-4361-9213-90d899860d1e\") " pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:15:00.607923 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:15:00.607889 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79498564-cc7d-4361-9213-90d899860d1e-proxy-tls podName:79498564-cc7d-4361-9213-90d899860d1e nodeName:}" failed. No retries permitted until 2026-04-20 20:15:01.107866652 +0000 UTC m=+608.854786700 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/79498564-cc7d-4361-9213-90d899860d1e-proxy-tls") pod "success-200-isvc-9290e-predictor-5b44755bcb-smm7n" (UID: "79498564-cc7d-4361-9213-90d899860d1e") : secret "success-200-isvc-9290e-predictor-serving-cert" not found Apr 20 20:15:00.608364 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.608347 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-9290e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/79498564-cc7d-4361-9213-90d899860d1e-success-200-isvc-9290e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9290e-predictor-5b44755bcb-smm7n\" (UID: \"79498564-cc7d-4361-9213-90d899860d1e\") " pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:15:00.628960 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:00.628935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4spg5\" (UniqueName: \"kubernetes.io/projected/79498564-cc7d-4361-9213-90d899860d1e-kube-api-access-4spg5\") pod \"success-200-isvc-9290e-predictor-5b44755bcb-smm7n\" (UID: \"79498564-cc7d-4361-9213-90d899860d1e\") " pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:15:01.112120 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:01.112074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79498564-cc7d-4361-9213-90d899860d1e-proxy-tls\") pod \"success-200-isvc-9290e-predictor-5b44755bcb-smm7n\" (UID: \"79498564-cc7d-4361-9213-90d899860d1e\") " pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:15:01.114491 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:01.114464 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79498564-cc7d-4361-9213-90d899860d1e-proxy-tls\") pod \"success-200-isvc-9290e-predictor-5b44755bcb-smm7n\" (UID: \"79498564-cc7d-4361-9213-90d899860d1e\") " pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:15:01.252515 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:01.252467 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:15:01.368842 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:01.368821 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n"] Apr 20 20:15:01.370966 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:15:01.370936 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79498564_cc7d_4361_9213_90d899860d1e.slice/crio-b5156ab293644b7e33a86e22bad754a699930e32d8e0c3054516f91347d234f4 WatchSource:0}: Error finding container b5156ab293644b7e33a86e22bad754a699930e32d8e0c3054516f91347d234f4: Status 404 returned error can't find the container with id b5156ab293644b7e33a86e22bad754a699930e32d8e0c3054516f91347d234f4 Apr 20 20:15:01.535160 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:01.535124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" event={"ID":"79498564-cc7d-4361-9213-90d899860d1e","Type":"ContainerStarted","Data":"6792ab9db1db158daf4c194475e2223d6bd4fc775621b5dd39d18a10d79cec9d"} Apr 20 20:15:01.535305 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:01.535172 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" event={"ID":"79498564-cc7d-4361-9213-90d899860d1e","Type":"ContainerStarted","Data":"c41eac106c544acdd7bd962c8bbcb06835a0cbd378aae106e07fafb738104120"} Apr 20 20:15:01.535305 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:01.535191 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" event={"ID":"79498564-cc7d-4361-9213-90d899860d1e","Type":"ContainerStarted","Data":"b5156ab293644b7e33a86e22bad754a699930e32d8e0c3054516f91347d234f4"} Apr 20 20:15:01.535305 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:01.535289 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:15:01.551358 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:01.551315 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" podStartSLOduration=1.551297511 podStartE2EDuration="1.551297511s" podCreationTimestamp="2026-04-20 20:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:15:01.550713569 +0000 UTC m=+609.297633641" watchObservedRunningTime="2026-04-20 20:15:01.551297511 +0000 UTC m=+609.298217582" Apr 20 20:15:02.539426 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:02.539391 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:15:02.540759 ip-10-0-140-19 kubenswrapper[2575]: I0420 
20:15:02.540730 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" podUID="79498564-cc7d-4361-9213-90d899860d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 20 20:15:03.145246 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.145225 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk"
Apr 20 20:15:03.325660 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.325631 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nchwg\" (UniqueName: \"kubernetes.io/projected/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-kube-api-access-nchwg\") pod \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\" (UID: \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\") "
Apr 20 20:15:03.325815 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.325690 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-proxy-tls\") pod \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\" (UID: \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\") "
Apr 20 20:15:03.325815 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.325711 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-127f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-success-200-isvc-127f1-kube-rbac-proxy-sar-config\") pod \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\" (UID: \"e6cb64b5-1bab-48c6-997f-4dddb3884c8d\") "
Apr 20 20:15:03.326087 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.326054 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-success-200-isvc-127f1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-127f1-kube-rbac-proxy-sar-config") pod "e6cb64b5-1bab-48c6-997f-4dddb3884c8d" (UID: "e6cb64b5-1bab-48c6-997f-4dddb3884c8d"). InnerVolumeSpecName "success-200-isvc-127f1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:15:03.327969 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.327936 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e6cb64b5-1bab-48c6-997f-4dddb3884c8d" (UID: "e6cb64b5-1bab-48c6-997f-4dddb3884c8d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:15:03.328052 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.328024 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-kube-api-access-nchwg" (OuterVolumeSpecName: "kube-api-access-nchwg") pod "e6cb64b5-1bab-48c6-997f-4dddb3884c8d" (UID: "e6cb64b5-1bab-48c6-997f-4dddb3884c8d"). InnerVolumeSpecName "kube-api-access-nchwg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:15:03.353464 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.353437 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29"
Apr 20 20:15:03.353577 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.353512 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz"
Apr 20 20:15:03.427057 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.427034 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nchwg\" (UniqueName: \"kubernetes.io/projected/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-kube-api-access-nchwg\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:15:03.427169 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.427061 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-proxy-tls\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:15:03.427169 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.427080 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-127f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6cb64b5-1bab-48c6-997f-4dddb3884c8d-success-200-isvc-127f1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:15:03.543291 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.543265 2575 generic.go:358] "Generic (PLEG): container finished" podID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerID="53e1c98e67e389c1ac10e8a5be544c661ae9aa6ea0f610bc935106a0a468d514" exitCode=0
Apr 20 20:15:03.543593 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.543346 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" event={"ID":"e6cb64b5-1bab-48c6-997f-4dddb3884c8d","Type":"ContainerDied","Data":"53e1c98e67e389c1ac10e8a5be544c661ae9aa6ea0f610bc935106a0a468d514"}
Apr 20 20:15:03.543593 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.543366 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk"
Apr 20 20:15:03.543593 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.543390 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk" event={"ID":"e6cb64b5-1bab-48c6-997f-4dddb3884c8d","Type":"ContainerDied","Data":"71fe5cd4d5fc661f4cbfa1a0938e11653162050978151d865515beb0ca8f98a0"}
Apr 20 20:15:03.543593 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.543407 2575 scope.go:117] "RemoveContainer" containerID="9e77cd4b2d91fedcc3ed8a50986177cd0a307d0a57fba3d33034b94afee84f2f"
Apr 20 20:15:03.544001 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.543970 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" podUID="79498564-cc7d-4361-9213-90d899860d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 20 20:15:03.551271 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.551166 2575 scope.go:117] "RemoveContainer" containerID="53e1c98e67e389c1ac10e8a5be544c661ae9aa6ea0f610bc935106a0a468d514"
Apr 20 20:15:03.557726 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.557710 2575 scope.go:117] "RemoveContainer" containerID="9e77cd4b2d91fedcc3ed8a50986177cd0a307d0a57fba3d33034b94afee84f2f"
Apr 20 20:15:03.557963 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:15:03.557943 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e77cd4b2d91fedcc3ed8a50986177cd0a307d0a57fba3d33034b94afee84f2f\": container with ID starting with 9e77cd4b2d91fedcc3ed8a50986177cd0a307d0a57fba3d33034b94afee84f2f not found: ID does not exist" containerID="9e77cd4b2d91fedcc3ed8a50986177cd0a307d0a57fba3d33034b94afee84f2f"
Apr 20 20:15:03.558020 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.557969 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e77cd4b2d91fedcc3ed8a50986177cd0a307d0a57fba3d33034b94afee84f2f"} err="failed to get container status \"9e77cd4b2d91fedcc3ed8a50986177cd0a307d0a57fba3d33034b94afee84f2f\": rpc error: code = NotFound desc = could not find container \"9e77cd4b2d91fedcc3ed8a50986177cd0a307d0a57fba3d33034b94afee84f2f\": container with ID starting with 9e77cd4b2d91fedcc3ed8a50986177cd0a307d0a57fba3d33034b94afee84f2f not found: ID does not exist"
Apr 20 20:15:03.558020 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.557983 2575 scope.go:117] "RemoveContainer" containerID="53e1c98e67e389c1ac10e8a5be544c661ae9aa6ea0f610bc935106a0a468d514"
Apr 20 20:15:03.558170 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:15:03.558153 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53e1c98e67e389c1ac10e8a5be544c661ae9aa6ea0f610bc935106a0a468d514\": container with ID starting with 53e1c98e67e389c1ac10e8a5be544c661ae9aa6ea0f610bc935106a0a468d514 not found: ID does not exist" containerID="53e1c98e67e389c1ac10e8a5be544c661ae9aa6ea0f610bc935106a0a468d514"
Apr 20 20:15:03.558219 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.558175 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53e1c98e67e389c1ac10e8a5be544c661ae9aa6ea0f610bc935106a0a468d514"} err="failed to get container status \"53e1c98e67e389c1ac10e8a5be544c661ae9aa6ea0f610bc935106a0a468d514\": rpc error: code = NotFound desc = could not find container \"53e1c98e67e389c1ac10e8a5be544c661ae9aa6ea0f610bc935106a0a468d514\": container with ID starting with 53e1c98e67e389c1ac10e8a5be544c661ae9aa6ea0f610bc935106a0a468d514 not found: ID does not exist"
Apr 20 20:15:03.566710 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.566683 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk"]
Apr 20 20:15:03.569472 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:03.569452 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-127f1-predictor-68b975dccb-dcphk"]
Apr 20 20:15:04.730391 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:04.730355 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" path="/var/lib/kubelet/pods/e6cb64b5-1bab-48c6-997f-4dddb3884c8d/volumes"
Apr 20 20:15:08.547811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:08.547736 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n"
Apr 20 20:15:08.548275 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:08.548227 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" podUID="79498564-cc7d-4361-9213-90d899860d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 20 20:15:18.548664 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:18.548625 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" podUID="79498564-cc7d-4361-9213-90d899860d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 20 20:15:28.548474 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:28.548432 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" podUID="79498564-cc7d-4361-9213-90d899860d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 20 20:15:38.549228 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:38.549185 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" podUID="79498564-cc7d-4361-9213-90d899860d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kube-rbac-proxy" containerID="cri-o://e36a3f4b23ee63c5e4164d5228f11d762b8d0691b61c123d15243bf7e743b163" gracePeriod=30 Apr 20 20:15:40.443306 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.443279 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29"] Apr 20 20:15:40.443615 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.443566 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kserve-container" containerID="cri-o://f49270ad0de5afcd86f16d1761d5365076b5f8ebcb930c11a265c439377c91a3" gracePeriod=30 Apr 20 20:15:40.443615 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.443589 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kube-rbac-proxy" containerID="cri-o://b225a0726a41bfb66adc03e0c5a4ad592c4ed10c048c9d5998a9bb90a39beeb7" gracePeriod=30 Apr 20 20:15:40.491443 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.491416 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6"] Apr 20 20:15:40.491734 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.491722 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerName="kserve-container" Apr 20 20:15:40.491778 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.491736 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerName="kserve-container" Apr 20 20:15:40.491814 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.491764 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerName="kube-rbac-proxy" Apr 20 20:15:40.491814 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.491797 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerName="kube-rbac-proxy" Apr 20 20:15:40.491876 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.491853 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerName="kube-rbac-proxy" Apr 20 20:15:40.491876 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.491864 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6cb64b5-1bab-48c6-997f-4dddb3884c8d" containerName="kserve-container" Apr 20 20:15:40.494746 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.494729 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:15:40.497500 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.497483 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-e6f34-predictor-serving-cert\"" Apr 20 20:15:40.497581 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.497487 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-e6f34-kube-rbac-proxy-sar-config\"" Apr 20 20:15:40.510858 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.510836 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6"] Apr 20 20:15:40.569910 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.569849 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-proxy-tls\") pod \"success-200-isvc-e6f34-predictor-57474bf446-r75x6\" (UID: \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\") " pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:15:40.570025 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.569909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-e6f34-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-success-200-isvc-e6f34-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e6f34-predictor-57474bf446-r75x6\" (UID: \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\") " pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:15:40.570025 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.569950 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbcvb\" (UniqueName: \"kubernetes.io/projected/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-kube-api-access-qbcvb\") pod \"success-200-isvc-e6f34-predictor-57474bf446-r75x6\" (UID: \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\") " pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:15:40.660152 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.660116 2575 generic.go:358] "Generic (PLEG): container finished" podID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerID="e36a3f4b23ee63c5e4164d5228f11d762b8d0691b61c123d15243bf7e743b163" exitCode=2 Apr 20 20:15:40.660328 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.660190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" event={"ID":"dabf688b-196b-4f63-9154-7b6bb4fc446c","Type":"ContainerDied","Data":"e36a3f4b23ee63c5e4164d5228f11d762b8d0691b61c123d15243bf7e743b163"} Apr 20 20:15:40.662339 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.662310 2575 generic.go:358] "Generic (PLEG): container finished" podID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerID="b225a0726a41bfb66adc03e0c5a4ad592c4ed10c048c9d5998a9bb90a39beeb7" exitCode=2 Apr 20 20:15:40.662458 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.662347 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" event={"ID":"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6","Type":"ContainerDied","Data":"b225a0726a41bfb66adc03e0c5a4ad592c4ed10c048c9d5998a9bb90a39beeb7"} Apr 
20 20:15:40.670827 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.670804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbcvb\" (UniqueName: \"kubernetes.io/projected/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-kube-api-access-qbcvb\") pod \"success-200-isvc-e6f34-predictor-57474bf446-r75x6\" (UID: \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\") " pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:15:40.670883 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.670855 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-proxy-tls\") pod \"success-200-isvc-e6f34-predictor-57474bf446-r75x6\" (UID: \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\") " pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:15:40.671052 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.671031 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-e6f34-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-success-200-isvc-e6f34-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e6f34-predictor-57474bf446-r75x6\" (UID: \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\") " pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:15:40.671707 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.671682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-e6f34-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-success-200-isvc-e6f34-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e6f34-predictor-57474bf446-r75x6\" (UID: \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\") " pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:15:40.673324 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.673303 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-proxy-tls\") pod \"success-200-isvc-e6f34-predictor-57474bf446-r75x6\" (UID: \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\") " pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:15:40.684642 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.684615 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbcvb\" (UniqueName: \"kubernetes.io/projected/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-kube-api-access-qbcvb\") pod \"success-200-isvc-e6f34-predictor-57474bf446-r75x6\" (UID: \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\") " pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:15:40.803713 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.803680 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:15:40.929712 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.929675 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6"] Apr 20 20:15:40.930297 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:15:40.930246 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc17dfe92_a4ff_4b88_8cfb_aebae882c7fd.slice/crio-0143fab1098dce1016e3244d0a73e0a560672a9e712f147d3cf8ac17dd0d77f4 WatchSource:0}: Error finding container 0143fab1098dce1016e3244d0a73e0a560672a9e712f147d3cf8ac17dd0d77f4: Status 404 returned error can't find the container with id 0143fab1098dce1016e3244d0a73e0a560672a9e712f147d3cf8ac17dd0d77f4 Apr 20 20:15:40.932176 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:40.932155 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:15:41.666890 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:41.666854 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" event={"ID":"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd","Type":"ContainerStarted","Data":"b8e6a1936d5ac6eed3dc8d09bb8d6c4b14f085dfd465b2cdd70bfab5e3872be3"} Apr 20 20:15:41.667269 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:41.666895 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" event={"ID":"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd","Type":"ContainerStarted","Data":"cbb36f533f5c6788c912024e4a9afb41b34f68ea4aa5a1474dd7876578a01efb"} Apr 20 20:15:41.667269 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:41.666911 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" event={"ID":"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd","Type":"ContainerStarted","Data":"0143fab1098dce1016e3244d0a73e0a560672a9e712f147d3cf8ac17dd0d77f4"} Apr 20 20:15:41.667269 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:41.667027 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:15:41.688216 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:41.688165 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" podStartSLOduration=1.688151288 podStartE2EDuration="1.688151288s" podCreationTimestamp="2026-04-20 20:15:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:15:41.687904534 +0000 UTC m=+649.434824604" watchObservedRunningTime="2026-04-20 20:15:41.688151288 +0000 UTC m=+649.435071361" Apr 20 20:15:42.670401 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:42.670369 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:15:42.671607 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:42.671578 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.21:8080: connect: connection refused" Apr 20 20:15:43.348609 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:43.348569 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.17:8643/healthz\": dial tcp 10.132.0.17:8643: connect: connection refused" Apr 20 20:15:43.348787 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:43.348569 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.19:8643/healthz\": dial tcp 10.132.0.19:8643: connect: connection refused" Apr 20 20:15:43.352973 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:43.352952 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 20 20:15:43.353043 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:43.352983 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 20 20:15:43.672833 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:43.672745 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 20 20:15:45.078133 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.078085 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" Apr 20 20:15:45.081009 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.080993 2575 util.go:48] "No ready sandbox for pod can be found. 
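The "Killing container with a grace period ... gracePeriod=30" entries above describe the usual two-phase stop: the runtime delivers SIGTERM and escalates to SIGKILL only if the container outlives the grace period, and the kube-rbac-proxy containers finishing with exitCode=2 shortly afterwards is consistent with terminating on the TERM. A process-level Go analogue of that pattern (a sketch under assumptions, not CRI-O's implementation; the sleep command is a stand-in workload):

    package main

    import (
        "fmt"
        "os"
        "os/exec"
        "syscall"
        "time"
    )

    // killWithGrace is a process-level analogue of a graceful container stop:
    // deliver SIGTERM, wait up to gracePeriod for the process to exit, then
    // fall back to SIGKILL.
    func killWithGrace(cmd *exec.Cmd, gracePeriod time.Duration) error {
        if err := cmd.Process.Signal(syscall.SIGTERM); err != nil {
            return err
        }
        done := make(chan error, 1)
        go func() { done <- cmd.Wait() }()
        select {
        case err := <-done:
            return err // exited within the grace period (possibly non-zero, cf. exitCode=2)
        case <-time.After(gracePeriod):
            _ = cmd.Process.Kill() // grace period elapsed: SIGKILL
            return <-done
        }
    }

    func main() {
        cmd := exec.Command("sleep", "300") // stand-in workload
        if err := cmd.Start(); err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        fmt.Println(killWithGrace(cmd, 30*time.Second)) // 30s mirrors gracePeriod=30
    }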
Apr 20 20:15:45.081009 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.080993 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz"
Apr 20 20:15:45.108114 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.108094 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dabf688b-196b-4f63-9154-7b6bb4fc446c-proxy-tls\") pod \"dabf688b-196b-4f63-9154-7b6bb4fc446c\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") "
Apr 20 20:15:45.108210 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.108139 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dabf688b-196b-4f63-9154-7b6bb4fc446c-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"dabf688b-196b-4f63-9154-7b6bb4fc446c\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") "
Apr 20 20:15:45.108210 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.108162 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-proxy-tls\") pod \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") "
Apr 20 20:15:45.108210 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.108191 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-kserve-provision-location\") pod \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") "
Apr 20 20:15:45.108385 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.108247 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f72cs\" (UniqueName: \"kubernetes.io/projected/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-kube-api-access-f72cs\") pod \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") "
Apr 20 20:15:45.108385 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.108292 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\" (UID: \"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6\") "
Apr 20 20:15:45.108385 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.108336 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dabf688b-196b-4f63-9154-7b6bb4fc446c-kserve-provision-location\") pod \"dabf688b-196b-4f63-9154-7b6bb4fc446c\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") "
Apr 20 20:15:45.108621 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.108593 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dabf688b-196b-4f63-9154-7b6bb4fc446c-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config") pod "dabf688b-196b-4f63-9154-7b6bb4fc446c" (UID: "dabf688b-196b-4f63-9154-7b6bb4fc446c"). InnerVolumeSpecName "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:15:45.108695 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.108615 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" (UID: "d251e1b5-ef06-472e-a8d0-6d5641b1a9e6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:15:45.108753 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.108695 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzwdl\" (UniqueName: \"kubernetes.io/projected/dabf688b-196b-4f63-9154-7b6bb4fc446c-kube-api-access-fzwdl\") pod \"dabf688b-196b-4f63-9154-7b6bb4fc446c\" (UID: \"dabf688b-196b-4f63-9154-7b6bb4fc446c\") "
Apr 20 20:15:45.108753 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.108696 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config") pod "d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" (UID: "d251e1b5-ef06-472e-a8d0-6d5641b1a9e6"). InnerVolumeSpecName "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:15:45.108951 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.108928 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dabf688b-196b-4f63-9154-7b6bb4fc446c-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:15:45.109026 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.108955 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-kserve-provision-location\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:15:45.109026 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.108971 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:15:45.109137 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.109056 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dabf688b-196b-4f63-9154-7b6bb4fc446c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dabf688b-196b-4f63-9154-7b6bb4fc446c" (UID: "dabf688b-196b-4f63-9154-7b6bb4fc446c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:15:45.110537 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.110513 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" (UID: "d251e1b5-ef06-472e-a8d0-6d5641b1a9e6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:15:45.110690 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.110671 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabf688b-196b-4f63-9154-7b6bb4fc446c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dabf688b-196b-4f63-9154-7b6bb4fc446c" (UID: "dabf688b-196b-4f63-9154-7b6bb4fc446c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:15:45.110738 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.110700 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-kube-api-access-f72cs" (OuterVolumeSpecName: "kube-api-access-f72cs") pod "d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" (UID: "d251e1b5-ef06-472e-a8d0-6d5641b1a9e6"). InnerVolumeSpecName "kube-api-access-f72cs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:15:45.110848 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.110829 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dabf688b-196b-4f63-9154-7b6bb4fc446c-kube-api-access-fzwdl" (OuterVolumeSpecName: "kube-api-access-fzwdl") pod "dabf688b-196b-4f63-9154-7b6bb4fc446c" (UID: "dabf688b-196b-4f63-9154-7b6bb4fc446c"). InnerVolumeSpecName "kube-api-access-fzwdl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:15:45.209254 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.209224 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f72cs\" (UniqueName: \"kubernetes.io/projected/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-kube-api-access-f72cs\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:15:45.209349 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.209273 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dabf688b-196b-4f63-9154-7b6bb4fc446c-kserve-provision-location\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:15:45.209349 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.209283 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fzwdl\" (UniqueName: \"kubernetes.io/projected/dabf688b-196b-4f63-9154-7b6bb4fc446c-kube-api-access-fzwdl\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:15:45.209349 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.209292 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dabf688b-196b-4f63-9154-7b6bb4fc446c-proxy-tls\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:15:45.209349 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.209302 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6-proxy-tls\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:15:45.679187 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.679154 2575 generic.go:358] "Generic (PLEG): container finished" podID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerID="f49270ad0de5afcd86f16d1761d5365076b5f8ebcb930c11a265c439377c91a3" exitCode=0
Apr 20 20:15:45.679366 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.679243 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29"
Apr 20 20:15:45.679366 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.679239 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" event={"ID":"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6","Type":"ContainerDied","Data":"f49270ad0de5afcd86f16d1761d5365076b5f8ebcb930c11a265c439377c91a3"}
Apr 20 20:15:45.679366 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.679304 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29" event={"ID":"d251e1b5-ef06-472e-a8d0-6d5641b1a9e6","Type":"ContainerDied","Data":"b1d7d65ea9abef444d16a900efb8439a0a19b26580de7d5924eea05966ab7c18"}
Apr 20 20:15:45.679366 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.679323 2575 scope.go:117] "RemoveContainer" containerID="b225a0726a41bfb66adc03e0c5a4ad592c4ed10c048c9d5998a9bb90a39beeb7"
Apr 20 20:15:45.681057 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.681032 2575 generic.go:358] "Generic (PLEG): container finished" podID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerID="0678547f26cf25c69daea0acccc0bad642055f867b48f187c0ab3fff5ee955a3" exitCode=0
Apr 20 20:15:45.681163 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.681062 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" event={"ID":"dabf688b-196b-4f63-9154-7b6bb4fc446c","Type":"ContainerDied","Data":"0678547f26cf25c69daea0acccc0bad642055f867b48f187c0ab3fff5ee955a3"}
Apr 20 20:15:45.681163 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.681096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz" event={"ID":"dabf688b-196b-4f63-9154-7b6bb4fc446c","Type":"ContainerDied","Data":"723243ae4740d7f76a03a5819e3104927723f316a016db7b35bbf2525608c7b7"}
Apr 20 20:15:45.681163 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.681112 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz"
Apr 20 20:15:45.688174 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.688155 2575 scope.go:117] "RemoveContainer" containerID="f49270ad0de5afcd86f16d1761d5365076b5f8ebcb930c11a265c439377c91a3"
Apr 20 20:15:45.695949 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.695934 2575 scope.go:117] "RemoveContainer" containerID="a8fdf39881cc7f5887364c01806f66ac5d8cf4042dd214002d0c7edb4aef907c"
Apr 20 20:15:45.702399 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.702380 2575 scope.go:117] "RemoveContainer" containerID="b225a0726a41bfb66adc03e0c5a4ad592c4ed10c048c9d5998a9bb90a39beeb7"
Apr 20 20:15:45.702637 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:15:45.702619 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b225a0726a41bfb66adc03e0c5a4ad592c4ed10c048c9d5998a9bb90a39beeb7\": container with ID starting with b225a0726a41bfb66adc03e0c5a4ad592c4ed10c048c9d5998a9bb90a39beeb7 not found: ID does not exist" containerID="b225a0726a41bfb66adc03e0c5a4ad592c4ed10c048c9d5998a9bb90a39beeb7"
Apr 20 20:15:45.702708 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.702650 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b225a0726a41bfb66adc03e0c5a4ad592c4ed10c048c9d5998a9bb90a39beeb7"} err="failed to get container status \"b225a0726a41bfb66adc03e0c5a4ad592c4ed10c048c9d5998a9bb90a39beeb7\": rpc error: code = NotFound desc = could not find container \"b225a0726a41bfb66adc03e0c5a4ad592c4ed10c048c9d5998a9bb90a39beeb7\": container with ID starting with b225a0726a41bfb66adc03e0c5a4ad592c4ed10c048c9d5998a9bb90a39beeb7 not found: ID does not exist"
Apr 20 20:15:45.702708 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.702668 2575 scope.go:117] "RemoveContainer" containerID="f49270ad0de5afcd86f16d1761d5365076b5f8ebcb930c11a265c439377c91a3"
Apr 20 20:15:45.702926 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:15:45.702906 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f49270ad0de5afcd86f16d1761d5365076b5f8ebcb930c11a265c439377c91a3\": container with ID starting with f49270ad0de5afcd86f16d1761d5365076b5f8ebcb930c11a265c439377c91a3 not found: ID does not exist" containerID="f49270ad0de5afcd86f16d1761d5365076b5f8ebcb930c11a265c439377c91a3"
Apr 20 20:15:45.703018 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.702928 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f49270ad0de5afcd86f16d1761d5365076b5f8ebcb930c11a265c439377c91a3"} err="failed to get container status \"f49270ad0de5afcd86f16d1761d5365076b5f8ebcb930c11a265c439377c91a3\": rpc error: code = NotFound desc = could not find container \"f49270ad0de5afcd86f16d1761d5365076b5f8ebcb930c11a265c439377c91a3\": container with ID starting with f49270ad0de5afcd86f16d1761d5365076b5f8ebcb930c11a265c439377c91a3 not found: ID does not exist"
Apr 20 20:15:45.703018 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.702943 2575 scope.go:117] "RemoveContainer" containerID="a8fdf39881cc7f5887364c01806f66ac5d8cf4042dd214002d0c7edb4aef907c"
Apr 20 20:15:45.703693 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:15:45.703285 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8fdf39881cc7f5887364c01806f66ac5d8cf4042dd214002d0c7edb4aef907c\": container with ID starting with a8fdf39881cc7f5887364c01806f66ac5d8cf4042dd214002d0c7edb4aef907c not found: ID does not exist" containerID="a8fdf39881cc7f5887364c01806f66ac5d8cf4042dd214002d0c7edb4aef907c"
Apr 20 20:15:45.703693 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.703317 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8fdf39881cc7f5887364c01806f66ac5d8cf4042dd214002d0c7edb4aef907c"} err="failed to get container status \"a8fdf39881cc7f5887364c01806f66ac5d8cf4042dd214002d0c7edb4aef907c\": rpc error: code = NotFound desc = could not find container \"a8fdf39881cc7f5887364c01806f66ac5d8cf4042dd214002d0c7edb4aef907c\": container with ID starting with a8fdf39881cc7f5887364c01806f66ac5d8cf4042dd214002d0c7edb4aef907c not found: ID does not exist"
Apr 20 20:15:45.703693 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.703337 2575 scope.go:117] "RemoveContainer" containerID="e36a3f4b23ee63c5e4164d5228f11d762b8d0691b61c123d15243bf7e743b163"
Apr 20 20:15:45.705025 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.705007 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29"]
Apr 20 20:15:45.708231 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.708208 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-7cl29"]
Apr 20 20:15:45.711037 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.711012 2575 scope.go:117] "RemoveContainer" containerID="0678547f26cf25c69daea0acccc0bad642055f867b48f187c0ab3fff5ee955a3"
Apr 20 20:15:45.717698 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.717682 2575 scope.go:117] "RemoveContainer" containerID="c9d26f7a004b9a1d2afff97be10ed0e57b3de34284f693c4623068c1152573b9"
Apr 20 20:15:45.718418 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.718400 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz"]
Apr 20 20:15:45.722358 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.722340 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-pr4dz"]
Apr 20 20:15:45.724676 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.724657 2575 scope.go:117] "RemoveContainer" containerID="e36a3f4b23ee63c5e4164d5228f11d762b8d0691b61c123d15243bf7e743b163"
Apr 20 20:15:45.724866 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:15:45.724848 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e36a3f4b23ee63c5e4164d5228f11d762b8d0691b61c123d15243bf7e743b163\": container with ID starting with e36a3f4b23ee63c5e4164d5228f11d762b8d0691b61c123d15243bf7e743b163 not found: ID does not exist" containerID="e36a3f4b23ee63c5e4164d5228f11d762b8d0691b61c123d15243bf7e743b163"
Apr 20 20:15:45.724930 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.724875 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36a3f4b23ee63c5e4164d5228f11d762b8d0691b61c123d15243bf7e743b163"} err="failed to get container status \"e36a3f4b23ee63c5e4164d5228f11d762b8d0691b61c123d15243bf7e743b163\": rpc error: code = NotFound desc = could not find container \"e36a3f4b23ee63c5e4164d5228f11d762b8d0691b61c123d15243bf7e743b163\": container with ID starting with e36a3f4b23ee63c5e4164d5228f11d762b8d0691b61c123d15243bf7e743b163 not found: ID does not exist"
Apr 20 20:15:45.724930 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.724895 2575 scope.go:117] "RemoveContainer" containerID="0678547f26cf25c69daea0acccc0bad642055f867b48f187c0ab3fff5ee955a3"
Apr 20 20:15:45.725151 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:15:45.725136 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0678547f26cf25c69daea0acccc0bad642055f867b48f187c0ab3fff5ee955a3\": container with ID starting with 0678547f26cf25c69daea0acccc0bad642055f867b48f187c0ab3fff5ee955a3 not found: ID does not exist" containerID="0678547f26cf25c69daea0acccc0bad642055f867b48f187c0ab3fff5ee955a3"
Apr 20 20:15:45.725192 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.725161 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0678547f26cf25c69daea0acccc0bad642055f867b48f187c0ab3fff5ee955a3"} err="failed to get container status \"0678547f26cf25c69daea0acccc0bad642055f867b48f187c0ab3fff5ee955a3\": rpc error: code = NotFound desc = could not find container \"0678547f26cf25c69daea0acccc0bad642055f867b48f187c0ab3fff5ee955a3\": container with ID starting with 0678547f26cf25c69daea0acccc0bad642055f867b48f187c0ab3fff5ee955a3 not found: ID does not exist"
Apr 20 20:15:45.725192 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.725177 2575 scope.go:117] "RemoveContainer" containerID="c9d26f7a004b9a1d2afff97be10ed0e57b3de34284f693c4623068c1152573b9"
Apr 20 20:15:45.725407 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:15:45.725390 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9d26f7a004b9a1d2afff97be10ed0e57b3de34284f693c4623068c1152573b9\": container with ID starting with c9d26f7a004b9a1d2afff97be10ed0e57b3de34284f693c4623068c1152573b9 not found: ID does not exist" containerID="c9d26f7a004b9a1d2afff97be10ed0e57b3de34284f693c4623068c1152573b9"
Apr 20 20:15:45.725453 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:45.725411 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d26f7a004b9a1d2afff97be10ed0e57b3de34284f693c4623068c1152573b9"} err="failed to get container status \"c9d26f7a004b9a1d2afff97be10ed0e57b3de34284f693c4623068c1152573b9\": rpc error: code = NotFound desc = could not find container \"c9d26f7a004b9a1d2afff97be10ed0e57b3de34284f693c4623068c1152573b9\": container with ID starting with c9d26f7a004b9a1d2afff97be10ed0e57b3de34284f693c4623068c1152573b9 not found: ID does not exist"
Apr 20 20:15:46.730374 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:46.730343 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" path="/var/lib/kubelet/pods/d251e1b5-ef06-472e-a8d0-6d5641b1a9e6/volumes"
Apr 20 20:15:46.730828 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:46.730812 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" path="/var/lib/kubelet/pods/dabf688b-196b-4f63-9154-7b6bb4fc446c/volumes"
Apr 20 20:15:48.548782 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:48.548754 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n"
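The RemoveContainer / "ContainerStatus from runtime service failed" pairs above are a benign race rather than a fault: the kubelet re-requests deletion of containers that CRI-O has already removed, and the runtime answers with gRPC NotFound. Callers that treat deletion as idempotent absorb exactly that code. A minimal sketch of the idiom (assumes the google.golang.org/grpc module; the remove function here is a stand-in, not the CRI client API):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeContainer deletes a container but treats "already gone" as
    // success, which is why the NotFound errors in the log are noisy yet
    // harmless.
    func removeContainer(remove func(id string) error, id string) error {
        err := remove(id)
        if status.Code(err) == codes.NotFound {
            return nil // container was already removed: idempotent delete
        }
        return err
    }

    func main() {
        // Stand-in for a runtime call that answers like CRI-O does above.
        gone := func(id string) error {
            return status.Errorf(codes.NotFound, "could not find container %q", id)
        }
        fmt.Println(removeContainer(gone, "9e77cd4b2d91")) // prints <nil>
    }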
pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:15:48.677450 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:48.677425 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 20 20:15:58.677790 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:15:58.677750 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 20 20:16:08.677867 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:16:08.677827 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 20 20:16:18.678474 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:16:18.678432 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 20 20:16:28.678420 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:16:28.678392 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" Apr 20 20:19:52.679718 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:19:52.679647 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:19:52.682758 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:19:52.682737 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:24:15.301797 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.301719 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n"] Apr 20 20:24:15.302282 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.302088 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" podUID="79498564-cc7d-4361-9213-90d899860d1e" containerName="kserve-container" containerID="cri-o://c41eac106c544acdd7bd962c8bbcb06835a0cbd378aae106e07fafb738104120" gracePeriod=30 Apr 20 20:24:15.302282 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.302119 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" podUID="79498564-cc7d-4361-9213-90d899860d1e" containerName="kube-rbac-proxy" containerID="cri-o://6792ab9db1db158daf4c194475e2223d6bd4fc775621b5dd39d18a10d79cec9d" gracePeriod=30 Apr 20 20:24:15.390878 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.390847 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn"] Apr 20 20:24:15.391136 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391125 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kserve-container" Apr 20 20:24:15.391178 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391138 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kserve-container" Apr 20 20:24:15.391178 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391152 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="storage-initializer" Apr 20 20:24:15.391178 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391158 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="storage-initializer" Apr 20 20:24:15.391178 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391169 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kube-rbac-proxy" Apr 20 20:24:15.391178 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391176 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kube-rbac-proxy" Apr 20 20:24:15.391347 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391182 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="storage-initializer" Apr 20 20:24:15.391347 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391187 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="storage-initializer" Apr 20 20:24:15.391347 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391193 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kserve-container" Apr 20 20:24:15.391347 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391199 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kserve-container" Apr 20 20:24:15.391347 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391205 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kube-rbac-proxy" Apr 20 20:24:15.391347 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391209 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kube-rbac-proxy" Apr 20 20:24:15.391347 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391263 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kserve-container" Apr 20 20:24:15.391347 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391270 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kube-rbac-proxy" Apr 20 20:24:15.391347 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391278 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="dabf688b-196b-4f63-9154-7b6bb4fc446c" containerName="kserve-container" Apr 20 20:24:15.391347 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.391286 2575 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="d251e1b5-ef06-472e-a8d0-6d5641b1a9e6" containerName="kube-rbac-proxy" Apr 20 20:24:15.394085 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.394069 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" Apr 20 20:24:15.396667 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.396645 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-7541f-predictor-serving-cert\"" Apr 20 20:24:15.396775 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.396645 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-7541f-kube-rbac-proxy-sar-config\"" Apr 20 20:24:15.412343 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.412316 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn"] Apr 20 20:24:15.515233 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.515203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9596c6c3-0a23-4d76-9f24-b392c23700e0-proxy-tls\") pod \"success-200-isvc-7541f-predictor-85fdf68876-fwrsn\" (UID: \"9596c6c3-0a23-4d76-9f24-b392c23700e0\") " pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" Apr 20 20:24:15.515393 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.515267 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw2jd\" (UniqueName: \"kubernetes.io/projected/9596c6c3-0a23-4d76-9f24-b392c23700e0-kube-api-access-vw2jd\") pod \"success-200-isvc-7541f-predictor-85fdf68876-fwrsn\" (UID: \"9596c6c3-0a23-4d76-9f24-b392c23700e0\") " pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" Apr 20 20:24:15.515393 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.515321 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-7541f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9596c6c3-0a23-4d76-9f24-b392c23700e0-success-200-isvc-7541f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7541f-predictor-85fdf68876-fwrsn\" (UID: \"9596c6c3-0a23-4d76-9f24-b392c23700e0\") " pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" Apr 20 20:24:15.616632 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.616573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vw2jd\" (UniqueName: \"kubernetes.io/projected/9596c6c3-0a23-4d76-9f24-b392c23700e0-kube-api-access-vw2jd\") pod \"success-200-isvc-7541f-predictor-85fdf68876-fwrsn\" (UID: \"9596c6c3-0a23-4d76-9f24-b392c23700e0\") " pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" Apr 20 20:24:15.616632 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.616605 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-7541f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9596c6c3-0a23-4d76-9f24-b392c23700e0-success-200-isvc-7541f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7541f-predictor-85fdf68876-fwrsn\" (UID: \"9596c6c3-0a23-4d76-9f24-b392c23700e0\") " pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" Apr 20 20:24:15.616767 
ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.616668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9596c6c3-0a23-4d76-9f24-b392c23700e0-proxy-tls\") pod \"success-200-isvc-7541f-predictor-85fdf68876-fwrsn\" (UID: \"9596c6c3-0a23-4d76-9f24-b392c23700e0\") " pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" Apr 20 20:24:15.616767 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:24:15.616764 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-7541f-predictor-serving-cert: secret "success-200-isvc-7541f-predictor-serving-cert" not found Apr 20 20:24:15.616861 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:24:15.616810 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9596c6c3-0a23-4d76-9f24-b392c23700e0-proxy-tls podName:9596c6c3-0a23-4d76-9f24-b392c23700e0 nodeName:}" failed. No retries permitted until 2026-04-20 20:24:16.116794738 +0000 UTC m=+1163.863714787 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9596c6c3-0a23-4d76-9f24-b392c23700e0-proxy-tls") pod "success-200-isvc-7541f-predictor-85fdf68876-fwrsn" (UID: "9596c6c3-0a23-4d76-9f24-b392c23700e0") : secret "success-200-isvc-7541f-predictor-serving-cert" not found Apr 20 20:24:15.617132 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.617115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-7541f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9596c6c3-0a23-4d76-9f24-b392c23700e0-success-200-isvc-7541f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7541f-predictor-85fdf68876-fwrsn\" (UID: \"9596c6c3-0a23-4d76-9f24-b392c23700e0\") " pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" Apr 20 20:24:15.627593 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:15.627565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw2jd\" (UniqueName: \"kubernetes.io/projected/9596c6c3-0a23-4d76-9f24-b392c23700e0-kube-api-access-vw2jd\") pod \"success-200-isvc-7541f-predictor-85fdf68876-fwrsn\" (UID: \"9596c6c3-0a23-4d76-9f24-b392c23700e0\") " pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" Apr 20 20:24:16.117460 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:16.117425 2575 generic.go:358] "Generic (PLEG): container finished" podID="79498564-cc7d-4361-9213-90d899860d1e" containerID="6792ab9db1db158daf4c194475e2223d6bd4fc775621b5dd39d18a10d79cec9d" exitCode=2 Apr 20 20:24:16.117614 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:16.117495 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" event={"ID":"79498564-cc7d-4361-9213-90d899860d1e","Type":"ContainerDied","Data":"6792ab9db1db158daf4c194475e2223d6bd4fc775621b5dd39d18a10d79cec9d"} Apr 20 20:24:16.121720 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:16.121702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9596c6c3-0a23-4d76-9f24-b392c23700e0-proxy-tls\") pod \"success-200-isvc-7541f-predictor-85fdf68876-fwrsn\" (UID: \"9596c6c3-0a23-4d76-9f24-b392c23700e0\") " pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" Apr 20 20:24:16.124022 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:16.124005 2575 operation_generator.go:615] 
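The proxy-tls mount fails once because the success-200-isvc-7541f-predictor-serving-cert secret does not exist yet, so the operation executor schedules a retry ("No retries permitted until ... durationBeforeRetry 500ms"), and the re-attempt at 20:24:16 succeeds once the secret appears. A generic retry-with-backoff sketch in that spirit (the 500ms base mirrors the log; the doubling and cap are assumptions, not kubelet's exact policy):

    package main

    import (
        "fmt"
        "time"
    )

    // retryWithBackoff retries op, doubling the wait after each failure up to
    // max, in the spirit of the "durationBeforeRetry 500ms" scheduling above.
    func retryWithBackoff(op func() error, base, max time.Duration, attempts int) error {
        delay := base
        var err error
        for i := 0; i < attempts; i++ {
            if err = op(); err == nil {
                return nil
            }
            fmt.Printf("attempt %d failed (%v); no retries permitted for %s\n", i+1, err, delay)
            time.Sleep(delay)
            if delay *= 2; delay > max {
                delay = max
            }
        }
        return err
    }

    func main() {
        calls := 0
        // Stand-in for MountVolume.SetUp: fails until the Secret shows up.
        mount := func() error {
            if calls++; calls < 3 {
                return fmt.Errorf("secret %q not found", "success-200-isvc-7541f-predictor-serving-cert")
            }
            return nil
        }
        fmt.Println(retryWithBackoff(mount, 500*time.Millisecond, 2*time.Minute, 5))
    }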
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9596c6c3-0a23-4d76-9f24-b392c23700e0-proxy-tls\") pod \"success-200-isvc-7541f-predictor-85fdf68876-fwrsn\" (UID: \"9596c6c3-0a23-4d76-9f24-b392c23700e0\") " pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" Apr 20 20:24:16.303630 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:16.303596 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" Apr 20 20:24:16.419745 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:16.419712 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn"] Apr 20 20:24:16.422780 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:24:16.422752 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9596c6c3_0a23_4d76_9f24_b392c23700e0.slice/crio-cbfa45843647c56926d00fed0ed33d91a05a6433a9431af9fe6b0a3c08ede325 WatchSource:0}: Error finding container cbfa45843647c56926d00fed0ed33d91a05a6433a9431af9fe6b0a3c08ede325: Status 404 returned error can't find the container with id cbfa45843647c56926d00fed0ed33d91a05a6433a9431af9fe6b0a3c08ede325 Apr 20 20:24:16.424584 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:16.424566 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:24:17.122210 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:17.122173 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" event={"ID":"9596c6c3-0a23-4d76-9f24-b392c23700e0","Type":"ContainerStarted","Data":"f7e58abcaf4c40e96674d311f14baa3d31546383727ebb1d2c009ccb71d0b649"} Apr 20 20:24:17.122210 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:17.122212 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" event={"ID":"9596c6c3-0a23-4d76-9f24-b392c23700e0","Type":"ContainerStarted","Data":"d6c6120b324cd390f7639e91ffd27568bba253e7219926e648041dd11095f278"} Apr 20 20:24:17.122475 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:17.122225 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" event={"ID":"9596c6c3-0a23-4d76-9f24-b392c23700e0","Type":"ContainerStarted","Data":"cbfa45843647c56926d00fed0ed33d91a05a6433a9431af9fe6b0a3c08ede325"} Apr 20 20:24:17.122475 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:17.122359 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" Apr 20 20:24:17.140913 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:17.140867 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" podStartSLOduration=2.140852181 podStartE2EDuration="2.140852181s" podCreationTimestamp="2026-04-20 20:24:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:24:17.139795875 +0000 UTC m=+1164.886715946" watchObservedRunningTime="2026-04-20 20:24:17.140852181 +0000 UTC m=+1164.887772253" Apr 20 20:24:18.046439 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.046417 2575 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:24:18.126968 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.126895 2575 generic.go:358] "Generic (PLEG): container finished" podID="79498564-cc7d-4361-9213-90d899860d1e" containerID="c41eac106c544acdd7bd962c8bbcb06835a0cbd378aae106e07fafb738104120" exitCode=0 Apr 20 20:24:18.127075 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.126968 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" Apr 20 20:24:18.127075 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.126979 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" event={"ID":"79498564-cc7d-4361-9213-90d899860d1e","Type":"ContainerDied","Data":"c41eac106c544acdd7bd962c8bbcb06835a0cbd378aae106e07fafb738104120"} Apr 20 20:24:18.127075 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.127014 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n" event={"ID":"79498564-cc7d-4361-9213-90d899860d1e","Type":"ContainerDied","Data":"b5156ab293644b7e33a86e22bad754a699930e32d8e0c3054516f91347d234f4"} Apr 20 20:24:18.127075 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.127031 2575 scope.go:117] "RemoveContainer" containerID="6792ab9db1db158daf4c194475e2223d6bd4fc775621b5dd39d18a10d79cec9d" Apr 20 20:24:18.127301 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.127283 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" Apr 20 20:24:18.128439 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.128418 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 20 20:24:18.134827 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.134805 2575 scope.go:117] "RemoveContainer" containerID="c41eac106c544acdd7bd962c8bbcb06835a0cbd378aae106e07fafb738104120" Apr 20 20:24:18.137854 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.137836 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79498564-cc7d-4361-9213-90d899860d1e-proxy-tls\") pod \"79498564-cc7d-4361-9213-90d899860d1e\" (UID: \"79498564-cc7d-4361-9213-90d899860d1e\") " Apr 20 20:24:18.137931 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.137908 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4spg5\" (UniqueName: \"kubernetes.io/projected/79498564-cc7d-4361-9213-90d899860d1e-kube-api-access-4spg5\") pod \"79498564-cc7d-4361-9213-90d899860d1e\" (UID: \"79498564-cc7d-4361-9213-90d899860d1e\") " Apr 20 20:24:18.137983 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.137934 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-9290e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/79498564-cc7d-4361-9213-90d899860d1e-success-200-isvc-9290e-kube-rbac-proxy-sar-config\") pod \"79498564-cc7d-4361-9213-90d899860d1e\" (UID: \"79498564-cc7d-4361-9213-90d899860d1e\") " 
Apr 20 20:24:18.138427 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.138398 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79498564-cc7d-4361-9213-90d899860d1e-success-200-isvc-9290e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-9290e-kube-rbac-proxy-sar-config") pod "79498564-cc7d-4361-9213-90d899860d1e" (UID: "79498564-cc7d-4361-9213-90d899860d1e"). InnerVolumeSpecName "success-200-isvc-9290e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:24:18.139824 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.139801 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79498564-cc7d-4361-9213-90d899860d1e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "79498564-cc7d-4361-9213-90d899860d1e" (UID: "79498564-cc7d-4361-9213-90d899860d1e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:24:18.139904 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.139867 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79498564-cc7d-4361-9213-90d899860d1e-kube-api-access-4spg5" (OuterVolumeSpecName: "kube-api-access-4spg5") pod "79498564-cc7d-4361-9213-90d899860d1e" (UID: "79498564-cc7d-4361-9213-90d899860d1e"). InnerVolumeSpecName "kube-api-access-4spg5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:24:18.141700 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.141679 2575 scope.go:117] "RemoveContainer" containerID="6792ab9db1db158daf4c194475e2223d6bd4fc775621b5dd39d18a10d79cec9d"
Apr 20 20:24:18.141945 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:24:18.141927 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6792ab9db1db158daf4c194475e2223d6bd4fc775621b5dd39d18a10d79cec9d\": container with ID starting with 6792ab9db1db158daf4c194475e2223d6bd4fc775621b5dd39d18a10d79cec9d not found: ID does not exist" containerID="6792ab9db1db158daf4c194475e2223d6bd4fc775621b5dd39d18a10d79cec9d"
Apr 20 20:24:18.142029 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.141964 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6792ab9db1db158daf4c194475e2223d6bd4fc775621b5dd39d18a10d79cec9d"} err="failed to get container status \"6792ab9db1db158daf4c194475e2223d6bd4fc775621b5dd39d18a10d79cec9d\": rpc error: code = NotFound desc = could not find container \"6792ab9db1db158daf4c194475e2223d6bd4fc775621b5dd39d18a10d79cec9d\": container with ID starting with 6792ab9db1db158daf4c194475e2223d6bd4fc775621b5dd39d18a10d79cec9d not found: ID does not exist"
Apr 20 20:24:18.142029 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.141987 2575 scope.go:117] "RemoveContainer" containerID="c41eac106c544acdd7bd962c8bbcb06835a0cbd378aae106e07fafb738104120"
Apr 20 20:24:18.142324 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:24:18.142304 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c41eac106c544acdd7bd962c8bbcb06835a0cbd378aae106e07fafb738104120\": container with ID starting with c41eac106c544acdd7bd962c8bbcb06835a0cbd378aae106e07fafb738104120 not found: ID does not exist" containerID="c41eac106c544acdd7bd962c8bbcb06835a0cbd378aae106e07fafb738104120"
Apr 20 20:24:18.142374 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.142330 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c41eac106c544acdd7bd962c8bbcb06835a0cbd378aae106e07fafb738104120"} err="failed to get container status \"c41eac106c544acdd7bd962c8bbcb06835a0cbd378aae106e07fafb738104120\": rpc error: code = NotFound desc = could not find container \"c41eac106c544acdd7bd962c8bbcb06835a0cbd378aae106e07fafb738104120\": container with ID starting with c41eac106c544acdd7bd962c8bbcb06835a0cbd378aae106e07fafb738104120 not found: ID does not exist"
Apr 20 20:24:18.238906 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.238882 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4spg5\" (UniqueName: \"kubernetes.io/projected/79498564-cc7d-4361-9213-90d899860d1e-kube-api-access-4spg5\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:24:18.238989 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.238909 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-9290e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/79498564-cc7d-4361-9213-90d899860d1e-success-200-isvc-9290e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:24:18.238989 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.238924 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79498564-cc7d-4361-9213-90d899860d1e-proxy-tls\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:24:18.447340 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.447305 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n"]
Apr 20 20:24:18.451104 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.451078 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9290e-predictor-5b44755bcb-smm7n"]
Apr 20 20:24:18.730897 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:18.730832 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79498564-cc7d-4361-9213-90d899860d1e" path="/var/lib/kubelet/pods/79498564-cc7d-4361-9213-90d899860d1e/volumes"
Apr 20 20:24:19.130536 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:19.130505 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 20 20:24:24.135062 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:24.135028 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn"
Apr 20 20:24:24.135625 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:24.135595 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 20 20:24:34.136350 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:34.136307 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 20 20:24:44.135829 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:44.135789 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 20 20:24:52.698431 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:52.698401 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log"
Apr 20 20:24:52.701668 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:52.701645 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log"
Apr 20 20:24:54.135510 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:54.135462 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 20 20:24:55.133558 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.133524 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6"]
Apr 20 20:24:55.133942 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.133882 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerName="kserve-container" containerID="cri-o://cbb36f533f5c6788c912024e4a9afb41b34f68ea4aa5a1474dd7876578a01efb" gracePeriod=30
Apr 20 20:24:55.134088 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.133968 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerName="kube-rbac-proxy" containerID="cri-o://b8e6a1936d5ac6eed3dc8d09bb8d6c4b14f085dfd465b2cdd70bfab5e3872be3" gracePeriod=30
Apr 20 20:24:55.159139 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.159115 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"]
Apr 20 20:24:55.159417 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.159404 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79498564-cc7d-4361-9213-90d899860d1e" containerName="kserve-container"
Apr 20 20:24:55.159464 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.159420 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="79498564-cc7d-4361-9213-90d899860d1e" containerName="kserve-container"
Apr 20 20:24:55.159464 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.159440 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79498564-cc7d-4361-9213-90d899860d1e" containerName="kube-rbac-proxy"
Apr 20 20:24:55.159464 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.159445 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="79498564-cc7d-4361-9213-90d899860d1e" containerName="kube-rbac-proxy"
Apr 20 20:24:55.159576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.159483 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="79498564-cc7d-4361-9213-90d899860d1e" containerName="kserve-container"
Apr 20 20:24:55.159576 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.159496 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="79498564-cc7d-4361-9213-90d899860d1e" containerName="kube-rbac-proxy"
Apr 20 20:24:55.163641 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.163625 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:55.165834 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.165815 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-63de8-predictor-serving-cert\""
Apr 20 20:24:55.165944 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.165815 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-63de8-kube-rbac-proxy-sar-config\""
Apr 20 20:24:55.172554 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.172535 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"]
Apr 20 20:24:55.178472 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.178452 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmqbb\" (UniqueName: \"kubernetes.io/projected/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-kube-api-access-tmqbb\") pod \"success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s\" (UID: \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\") " pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:55.178566 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.178483 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-63de8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-success-200-isvc-63de8-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s\" (UID: \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\") " pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:55.178566 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.178508 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-proxy-tls\") pod \"success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s\" (UID: \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\") " pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:55.279548 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.279521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmqbb\" (UniqueName: \"kubernetes.io/projected/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-kube-api-access-tmqbb\") pod \"success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s\" (UID: \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\") " pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:55.279664 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.279556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-63de8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-success-200-isvc-63de8-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s\" (UID: \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\") " pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:55.279664 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.279578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-proxy-tls\") pod \"success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s\" (UID: \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\") " pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:55.279664 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:24:55.279661 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-63de8-predictor-serving-cert: secret "success-200-isvc-63de8-predictor-serving-cert" not found
Apr 20 20:24:55.279785 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:24:55.279719 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-proxy-tls podName:967bd600-f4c2-4001-97a3-77b3ce9d9c1c nodeName:}" failed. No retries permitted until 2026-04-20 20:24:55.779701768 +0000 UTC m=+1203.526621820 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-proxy-tls") pod "success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" (UID: "967bd600-f4c2-4001-97a3-77b3ce9d9c1c") : secret "success-200-isvc-63de8-predictor-serving-cert" not found
Apr 20 20:24:55.280156 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.280136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-63de8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-success-200-isvc-63de8-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s\" (UID: \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\") " pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:55.291240 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.291214 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmqbb\" (UniqueName: \"kubernetes.io/projected/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-kube-api-access-tmqbb\") pod \"success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s\" (UID: \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\") " pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:55.783408 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:55.783364 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-proxy-tls\") pod \"success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s\" (UID: \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\") " pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:55.783611 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:24:55.783520 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-63de8-predictor-serving-cert: secret "success-200-isvc-63de8-predictor-serving-cert" not found
Apr 20 20:24:55.783611 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:24:55.783596 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-proxy-tls podName:967bd600-f4c2-4001-97a3-77b3ce9d9c1c nodeName:}" failed. No retries permitted until 2026-04-20 20:24:56.783577293 +0000 UTC m=+1204.530497344 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-proxy-tls") pod "success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" (UID: "967bd600-f4c2-4001-97a3-77b3ce9d9c1c") : secret "success-200-isvc-63de8-predictor-serving-cert" not found
Apr 20 20:24:56.235528 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:56.235436 2575 generic.go:358] "Generic (PLEG): container finished" podID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerID="b8e6a1936d5ac6eed3dc8d09bb8d6c4b14f085dfd465b2cdd70bfab5e3872be3" exitCode=2
Apr 20 20:24:56.235528 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:56.235479 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" event={"ID":"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd","Type":"ContainerDied","Data":"b8e6a1936d5ac6eed3dc8d09bb8d6c4b14f085dfd465b2cdd70bfab5e3872be3"}
Apr 20 20:24:56.789623 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:56.789590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-proxy-tls\") pod \"success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s\" (UID: \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\") " pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:56.791960 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:56.791930 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-proxy-tls\") pod \"success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s\" (UID: \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\") " pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:56.973725 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:56.973689 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:57.090498 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.090474 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"]
Apr 20 20:24:57.092904 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:24:57.092870 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod967bd600_f4c2_4001_97a3_77b3ce9d9c1c.slice/crio-86737457e3eafd8b5524fd73e7c5b652e0997f8de795db157859d249f414f069 WatchSource:0}: Error finding container 86737457e3eafd8b5524fd73e7c5b652e0997f8de795db157859d249f414f069: Status 404 returned error can't find the container with id 86737457e3eafd8b5524fd73e7c5b652e0997f8de795db157859d249f414f069
Apr 20 20:24:57.240587 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.240556 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" event={"ID":"967bd600-f4c2-4001-97a3-77b3ce9d9c1c","Type":"ContainerStarted","Data":"b44bcdcb78025ebaf43b43578bb7056d717d1ce34c402f28a0dda2d9dadc48fc"}
Apr 20 20:24:57.240587 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.240591 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" event={"ID":"967bd600-f4c2-4001-97a3-77b3ce9d9c1c","Type":"ContainerStarted","Data":"f0e9bb9e4fe72478b0dccc4c489e9c4ab1752837f21b142f6fe01e87f6df86d3"}
Apr 20 20:24:57.240929 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.240601 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" event={"ID":"967bd600-f4c2-4001-97a3-77b3ce9d9c1c","Type":"ContainerStarted","Data":"86737457e3eafd8b5524fd73e7c5b652e0997f8de795db157859d249f414f069"}
Apr 20 20:24:57.240929 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.240687 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:57.258329 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.258284 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" podStartSLOduration=2.2582703029999998 podStartE2EDuration="2.258270303s" podCreationTimestamp="2026-04-20 20:24:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:24:57.256161552 +0000 UTC m=+1205.003081624" watchObservedRunningTime="2026-04-20 20:24:57.258270303 +0000 UTC m=+1205.005190371"
Apr 20 20:24:57.765032 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.765005 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6"
Apr 20 20:24:57.797378 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.797350 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-proxy-tls\") pod \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\" (UID: \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\") "
Apr 20 20:24:57.797513 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.797427 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-e6f34-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-success-200-isvc-e6f34-kube-rbac-proxy-sar-config\") pod \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\" (UID: \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\") "
Apr 20 20:24:57.797714 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.797647 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbcvb\" (UniqueName: \"kubernetes.io/projected/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-kube-api-access-qbcvb\") pod \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\" (UID: \"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd\") "
Apr 20 20:24:57.797819 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.797748 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-success-200-isvc-e6f34-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-e6f34-kube-rbac-proxy-sar-config") pod "c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" (UID: "c17dfe92-a4ff-4b88-8cfb-aebae882c7fd"). InnerVolumeSpecName "success-200-isvc-e6f34-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:24:57.797940 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.797903 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-e6f34-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-success-200-isvc-e6f34-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:24:57.799546 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.799522 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" (UID: "c17dfe92-a4ff-4b88-8cfb-aebae882c7fd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:24:57.799812 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.799789 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-kube-api-access-qbcvb" (OuterVolumeSpecName: "kube-api-access-qbcvb") pod "c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" (UID: "c17dfe92-a4ff-4b88-8cfb-aebae882c7fd"). InnerVolumeSpecName "kube-api-access-qbcvb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:24:57.898621 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.898556 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-proxy-tls\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:24:57.898621 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:57.898593 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qbcvb\" (UniqueName: \"kubernetes.io/projected/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd-kube-api-access-qbcvb\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:24:58.244852 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.244765 2575 generic.go:358] "Generic (PLEG): container finished" podID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerID="cbb36f533f5c6788c912024e4a9afb41b34f68ea4aa5a1474dd7876578a01efb" exitCode=0
Apr 20 20:24:58.245297 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.244854 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6"
Apr 20 20:24:58.245297 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.244856 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" event={"ID":"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd","Type":"ContainerDied","Data":"cbb36f533f5c6788c912024e4a9afb41b34f68ea4aa5a1474dd7876578a01efb"}
Apr 20 20:24:58.245297 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.244896 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6" event={"ID":"c17dfe92-a4ff-4b88-8cfb-aebae882c7fd","Type":"ContainerDied","Data":"0143fab1098dce1016e3244d0a73e0a560672a9e712f147d3cf8ac17dd0d77f4"}
Apr 20 20:24:58.245297 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.244919 2575 scope.go:117] "RemoveContainer" containerID="b8e6a1936d5ac6eed3dc8d09bb8d6c4b14f085dfd465b2cdd70bfab5e3872be3"
Apr 20 20:24:58.245539 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.245419 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:24:58.246922 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.246897 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused"
Apr 20 20:24:58.252953 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.252723 2575 scope.go:117] "RemoveContainer" containerID="cbb36f533f5c6788c912024e4a9afb41b34f68ea4aa5a1474dd7876578a01efb"
Apr 20 20:24:58.259403 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.259384 2575 scope.go:117] "RemoveContainer" containerID="b8e6a1936d5ac6eed3dc8d09bb8d6c4b14f085dfd465b2cdd70bfab5e3872be3"
Apr 20 20:24:58.259639 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:24:58.259624 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8e6a1936d5ac6eed3dc8d09bb8d6c4b14f085dfd465b2cdd70bfab5e3872be3\": container with ID starting with b8e6a1936d5ac6eed3dc8d09bb8d6c4b14f085dfd465b2cdd70bfab5e3872be3 not found: ID does not exist" containerID="b8e6a1936d5ac6eed3dc8d09bb8d6c4b14f085dfd465b2cdd70bfab5e3872be3"
Apr 20 20:24:58.259681 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.259645 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e6a1936d5ac6eed3dc8d09bb8d6c4b14f085dfd465b2cdd70bfab5e3872be3"} err="failed to get container status \"b8e6a1936d5ac6eed3dc8d09bb8d6c4b14f085dfd465b2cdd70bfab5e3872be3\": rpc error: code = NotFound desc = could not find container \"b8e6a1936d5ac6eed3dc8d09bb8d6c4b14f085dfd465b2cdd70bfab5e3872be3\": container with ID starting with b8e6a1936d5ac6eed3dc8d09bb8d6c4b14f085dfd465b2cdd70bfab5e3872be3 not found: ID does not exist"
Apr 20 20:24:58.259681 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.259661 2575 scope.go:117] "RemoveContainer" containerID="cbb36f533f5c6788c912024e4a9afb41b34f68ea4aa5a1474dd7876578a01efb"
Apr 20 20:24:58.259888 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:24:58.259871 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb36f533f5c6788c912024e4a9afb41b34f68ea4aa5a1474dd7876578a01efb\": container with ID starting with cbb36f533f5c6788c912024e4a9afb41b34f68ea4aa5a1474dd7876578a01efb not found: ID does not exist" containerID="cbb36f533f5c6788c912024e4a9afb41b34f68ea4aa5a1474dd7876578a01efb"
Apr 20 20:24:58.259984 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.259894 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb36f533f5c6788c912024e4a9afb41b34f68ea4aa5a1474dd7876578a01efb"} err="failed to get container status \"cbb36f533f5c6788c912024e4a9afb41b34f68ea4aa5a1474dd7876578a01efb\": rpc error: code = NotFound desc = could not find container \"cbb36f533f5c6788c912024e4a9afb41b34f68ea4aa5a1474dd7876578a01efb\": container with ID starting with cbb36f533f5c6788c912024e4a9afb41b34f68ea4aa5a1474dd7876578a01efb not found: ID does not exist"
Apr 20 20:24:58.264850 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.264829 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6"]
Apr 20 20:24:58.267837 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.267816 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e6f34-predictor-57474bf446-r75x6"]
Apr 20 20:24:58.732611 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:58.732582 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" path="/var/lib/kubelet/pods/c17dfe92-a4ff-4b88-8cfb-aebae882c7fd/volumes"
Apr 20 20:24:59.248991 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:24:59.248959 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused"
Apr 20 20:25:04.136320 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:04.136290 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn"
Apr 20 20:25:04.253740 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:04.253712 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:25:04.254156 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:04.254132 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused"
Apr 20 20:25:14.254204 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:14.254160 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused"
Apr 20 20:25:24.255052 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:24.255008 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused"
Apr 20 20:25:25.643489 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.643454 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn"]
Apr 20 20:25:25.643892 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.643802 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerName="kserve-container" containerID="cri-o://d6c6120b324cd390f7639e91ffd27568bba253e7219926e648041dd11095f278" gracePeriod=30
Apr 20 20:25:25.643950 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.643868 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerName="kube-rbac-proxy" containerID="cri-o://f7e58abcaf4c40e96674d311f14baa3d31546383727ebb1d2c009ccb71d0b649" gracePeriod=30
Apr 20 20:25:25.661025 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:25:25.661000 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9596c6c3_0a23_4d76_9f24_b392c23700e0.slice/crio-f7e58abcaf4c40e96674d311f14baa3d31546383727ebb1d2c009ccb71d0b649.scope\": RecentStats: unable to find data in memory cache]"
Apr 20 20:25:25.672640 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.672616 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"]
Apr 20 20:25:25.672882 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.672870 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerName="kserve-container"
Apr 20 20:25:25.672882 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.672883 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerName="kserve-container"
Apr 20 20:25:25.673022 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.672892 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerName="kube-rbac-proxy"
Apr 20 20:25:25.673022 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.672901 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerName="kube-rbac-proxy"
Apr 20 20:25:25.673022 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.672958 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerName="kserve-container"
Apr 20 20:25:25.673022 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.672968 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c17dfe92-a4ff-4b88-8cfb-aebae882c7fd" containerName="kube-rbac-proxy"
Apr 20 20:25:25.675938 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.675922 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:25.678214 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.678193 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-38d71-predictor-serving-cert\""
Apr 20 20:25:25.678311 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.678241 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-38d71-kube-rbac-proxy-sar-config\""
Apr 20 20:25:25.684767 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.684732 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"]
Apr 20 20:25:25.768583 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.768555 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-proxy-tls\") pod \"success-200-isvc-38d71-predictor-c6f86d4b4-w2stk\" (UID: \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\") " pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:25.768702 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.768601 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcnbc\" (UniqueName: \"kubernetes.io/projected/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-kube-api-access-kcnbc\") pod \"success-200-isvc-38d71-predictor-c6f86d4b4-w2stk\" (UID: \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\") " pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:25.768702 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.768638 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-38d71-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-success-200-isvc-38d71-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-38d71-predictor-c6f86d4b4-w2stk\" (UID: \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\") " pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:25.869194 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.869166 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-proxy-tls\") pod \"success-200-isvc-38d71-predictor-c6f86d4b4-w2stk\" (UID: \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\") " pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:25.869318 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.869217 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcnbc\" (UniqueName: \"kubernetes.io/projected/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-kube-api-access-kcnbc\") pod \"success-200-isvc-38d71-predictor-c6f86d4b4-w2stk\" (UID: \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\") " pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:25.869318 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.869281 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-38d71-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-success-200-isvc-38d71-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-38d71-predictor-c6f86d4b4-w2stk\" (UID: \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\") " pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:25.869448 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:25:25.869330 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-38d71-predictor-serving-cert: secret "success-200-isvc-38d71-predictor-serving-cert" not found
Apr 20 20:25:25.869448 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:25:25.869392 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-proxy-tls podName:e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8 nodeName:}" failed. No retries permitted until 2026-04-20 20:25:26.36937328 +0000 UTC m=+1234.116293333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-proxy-tls") pod "success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" (UID: "e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8") : secret "success-200-isvc-38d71-predictor-serving-cert" not found
Apr 20 20:25:25.869894 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.869871 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-38d71-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-success-200-isvc-38d71-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-38d71-predictor-c6f86d4b4-w2stk\" (UID: \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\") " pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:25.878143 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:25.878118 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcnbc\" (UniqueName: \"kubernetes.io/projected/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-kube-api-access-kcnbc\") pod \"success-200-isvc-38d71-predictor-c6f86d4b4-w2stk\" (UID: \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\") " pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:26.323549 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:26.323516 2575 generic.go:358] "Generic (PLEG): container finished" podID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerID="f7e58abcaf4c40e96674d311f14baa3d31546383727ebb1d2c009ccb71d0b649" exitCode=2
Apr 20 20:25:26.323709 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:26.323554 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" event={"ID":"9596c6c3-0a23-4d76-9f24-b392c23700e0","Type":"ContainerDied","Data":"f7e58abcaf4c40e96674d311f14baa3d31546383727ebb1d2c009ccb71d0b649"}
Apr 20 20:25:26.372635 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:26.372606 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-proxy-tls\") pod \"success-200-isvc-38d71-predictor-c6f86d4b4-w2stk\" (UID: \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\") " pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:26.374911 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:26.374890 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-proxy-tls\") pod \"success-200-isvc-38d71-predictor-c6f86d4b4-w2stk\" (UID: \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\") " pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:26.586750 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:26.586651 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:26.714111 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:26.713807 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"]
Apr 20 20:25:26.716393 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:25:26.716366 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37a9c1b_d07a_4b7c_8cf3_d03f4a9040a8.slice/crio-6881e23389450554d85883dbef2650813e145f8a5868c5f05480cd44a12132f9 WatchSource:0}: Error finding container 6881e23389450554d85883dbef2650813e145f8a5868c5f05480cd44a12132f9: Status 404 returned error can't find the container with id 6881e23389450554d85883dbef2650813e145f8a5868c5f05480cd44a12132f9
Apr 20 20:25:27.328626 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:27.328580 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" event={"ID":"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8","Type":"ContainerStarted","Data":"de6f0f3e68d7b1e65500a66033d23124fdefc74ea142673947fa09198c69c2a1"}
Apr 20 20:25:27.328626 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:27.328627 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" event={"ID":"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8","Type":"ContainerStarted","Data":"2339087af217d7e67c87294aa3b7eb282bd65a61f0affcabe164139cd1df3598"}
Apr 20 20:25:27.328852 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:27.328643 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" event={"ID":"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8","Type":"ContainerStarted","Data":"6881e23389450554d85883dbef2650813e145f8a5868c5f05480cd44a12132f9"}
Apr 20 20:25:27.328852 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:27.328746 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:27.346735 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:27.346690 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" podStartSLOduration=2.346675112 podStartE2EDuration="2.346675112s" podCreationTimestamp="2026-04-20 20:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:25:27.344599899 +0000 UTC m=+1235.091519970" watchObservedRunningTime="2026-04-20 20:25:27.346675112 +0000 UTC m=+1235.093595174"
Apr 20 20:25:28.334055 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:28.334018 2575 generic.go:358] "Generic (PLEG): container finished" podID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerID="d6c6120b324cd390f7639e91ffd27568bba253e7219926e648041dd11095f278" exitCode=0
Apr 20 20:25:28.334450 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:28.334056 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" event={"ID":"9596c6c3-0a23-4d76-9f24-b392c23700e0","Type":"ContainerDied","Data":"d6c6120b324cd390f7639e91ffd27568bba253e7219926e648041dd11095f278"}
Apr 20 20:25:28.334450 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:28.334361 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:28.335638 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:28.335607 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 20 20:25:28.374400 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:28.374381 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn"
Apr 20 20:25:28.389241 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:28.389222 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw2jd\" (UniqueName: \"kubernetes.io/projected/9596c6c3-0a23-4d76-9f24-b392c23700e0-kube-api-access-vw2jd\") pod \"9596c6c3-0a23-4d76-9f24-b392c23700e0\" (UID: \"9596c6c3-0a23-4d76-9f24-b392c23700e0\") "
Apr 20 20:25:28.389334 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:28.389266 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-7541f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9596c6c3-0a23-4d76-9f24-b392c23700e0-success-200-isvc-7541f-kube-rbac-proxy-sar-config\") pod \"9596c6c3-0a23-4d76-9f24-b392c23700e0\" (UID: \"9596c6c3-0a23-4d76-9f24-b392c23700e0\") "
Apr 20 20:25:28.389377 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:28.389361 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9596c6c3-0a23-4d76-9f24-b392c23700e0-proxy-tls\") pod \"9596c6c3-0a23-4d76-9f24-b392c23700e0\" (UID: \"9596c6c3-0a23-4d76-9f24-b392c23700e0\") "
Apr 20 20:25:28.389810 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:28.389652 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9596c6c3-0a23-4d76-9f24-b392c23700e0-success-200-isvc-7541f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-7541f-kube-rbac-proxy-sar-config") pod "9596c6c3-0a23-4d76-9f24-b392c23700e0" (UID: "9596c6c3-0a23-4d76-9f24-b392c23700e0"). InnerVolumeSpecName "success-200-isvc-7541f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:25:28.391745 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:28.391641 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9596c6c3-0a23-4d76-9f24-b392c23700e0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9596c6c3-0a23-4d76-9f24-b392c23700e0" (UID: "9596c6c3-0a23-4d76-9f24-b392c23700e0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:25:28.391745 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:28.391730 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9596c6c3-0a23-4d76-9f24-b392c23700e0-kube-api-access-vw2jd" (OuterVolumeSpecName: "kube-api-access-vw2jd") pod "9596c6c3-0a23-4d76-9f24-b392c23700e0" (UID: "9596c6c3-0a23-4d76-9f24-b392c23700e0"). InnerVolumeSpecName "kube-api-access-vw2jd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:25:28.490816 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:28.490766 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vw2jd\" (UniqueName: \"kubernetes.io/projected/9596c6c3-0a23-4d76-9f24-b392c23700e0-kube-api-access-vw2jd\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:25:28.490816 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:28.490786 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-7541f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9596c6c3-0a23-4d76-9f24-b392c23700e0-success-200-isvc-7541f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:25:28.490816 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:28.490799 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9596c6c3-0a23-4d76-9f24-b392c23700e0-proxy-tls\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:25:29.338946 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:29.338902 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn" event={"ID":"9596c6c3-0a23-4d76-9f24-b392c23700e0","Type":"ContainerDied","Data":"cbfa45843647c56926d00fed0ed33d91a05a6433a9431af9fe6b0a3c08ede325"}
Apr 20 20:25:29.339421 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:29.338967 2575 scope.go:117] "RemoveContainer" containerID="f7e58abcaf4c40e96674d311f14baa3d31546383727ebb1d2c009ccb71d0b649"
Apr 20 20:25:29.339421 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:29.338974 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn"
Apr 20 20:25:29.339703 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:29.339676 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 20 20:25:29.346917 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:29.346904 2575 scope.go:117] "RemoveContainer" containerID="d6c6120b324cd390f7639e91ffd27568bba253e7219926e648041dd11095f278"
Apr 20 20:25:29.354643 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:29.354623 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn"]
Apr 20 20:25:29.357982 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:29.357964 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7541f-predictor-85fdf68876-fwrsn"]
Apr 20 20:25:30.731666 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:30.731633 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" path="/var/lib/kubelet/pods/9596c6c3-0a23-4d76-9f24-b392c23700e0/volumes"
Apr 20 20:25:34.254207 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:34.254167 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused"
Apr 20 20:25:34.343263 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:34.343217 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
Apr 20 20:25:34.343741 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:34.343714 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 20 20:25:44.254454 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:44.254372 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"
Apr 20 20:25:44.344644 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:44.344614 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 20 20:25:54.343691 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:25:54.343634 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 20 20:26:04.343737 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:04.343699 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"
podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 20 20:26:05.378922 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.378890 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"] Apr 20 20:26:05.379324 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.379172 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerName="kserve-container" containerID="cri-o://f0e9bb9e4fe72478b0dccc4c489e9c4ab1752837f21b142f6fe01e87f6df86d3" gracePeriod=30 Apr 20 20:26:05.379324 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.379202 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerName="kube-rbac-proxy" containerID="cri-o://b44bcdcb78025ebaf43b43578bb7056d717d1ce34c402f28a0dda2d9dadc48fc" gracePeriod=30 Apr 20 20:26:05.403955 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.403933 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z"] Apr 20 20:26:05.404192 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.404181 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerName="kserve-container" Apr 20 20:26:05.404232 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.404194 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerName="kserve-container" Apr 20 20:26:05.404232 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.404212 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerName="kube-rbac-proxy" Apr 20 20:26:05.404232 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.404218 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerName="kube-rbac-proxy" Apr 20 20:26:05.404351 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.404279 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerName="kserve-container" Apr 20 20:26:05.404351 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.404286 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9596c6c3-0a23-4d76-9f24-b392c23700e0" containerName="kube-rbac-proxy" Apr 20 20:26:05.408637 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.408621 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:26:05.410847 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.410825 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-fe410-predictor-serving-cert\"" Apr 20 20:26:05.411014 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.410901 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-fe410-kube-rbac-proxy-sar-config\"" Apr 20 20:26:05.418816 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.418190 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z"] Apr 20 20:26:05.543552 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.543524 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdc4p\" (UniqueName: \"kubernetes.io/projected/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-kube-api-access-fdc4p\") pod \"success-200-isvc-fe410-predictor-66c5568bb-9sm5z\" (UID: \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\") " pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:26:05.543690 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.543564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-fe410-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-success-200-isvc-fe410-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-fe410-predictor-66c5568bb-9sm5z\" (UID: \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\") " pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:26:05.543690 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.543595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-proxy-tls\") pod \"success-200-isvc-fe410-predictor-66c5568bb-9sm5z\" (UID: \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\") " pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:26:05.644133 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.644060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdc4p\" (UniqueName: \"kubernetes.io/projected/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-kube-api-access-fdc4p\") pod \"success-200-isvc-fe410-predictor-66c5568bb-9sm5z\" (UID: \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\") " pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:26:05.644133 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.644107 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-fe410-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-success-200-isvc-fe410-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-fe410-predictor-66c5568bb-9sm5z\" (UID: \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\") " pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:26:05.644343 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.644143 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-proxy-tls\") pod 
\"success-200-isvc-fe410-predictor-66c5568bb-9sm5z\" (UID: \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\") " pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:26:05.644836 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.644816 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-fe410-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-success-200-isvc-fe410-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-fe410-predictor-66c5568bb-9sm5z\" (UID: \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\") " pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:26:05.646509 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.646492 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-proxy-tls\") pod \"success-200-isvc-fe410-predictor-66c5568bb-9sm5z\" (UID: \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\") " pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:26:05.652039 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.652018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdc4p\" (UniqueName: \"kubernetes.io/projected/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-kube-api-access-fdc4p\") pod \"success-200-isvc-fe410-predictor-66c5568bb-9sm5z\" (UID: \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\") " pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:26:05.720477 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.720455 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:26:05.835971 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:05.835936 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z"] Apr 20 20:26:05.838864 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:26:05.838838 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d11f25d_fee7_4da2_84fa_2de26a1f4cb5.slice/crio-e261fefa74bbddbf7db7b43c5f31df2815bf4b9563ce79e2517416d08cea434c WatchSource:0}: Error finding container e261fefa74bbddbf7db7b43c5f31df2815bf4b9563ce79e2517416d08cea434c: Status 404 returned error can't find the container with id e261fefa74bbddbf7db7b43c5f31df2815bf4b9563ce79e2517416d08cea434c Apr 20 20:26:06.448485 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:06.448452 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" event={"ID":"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5","Type":"ContainerStarted","Data":"a324cc89ba55a598850fa9ec55a44e8f16dd3bad018ddafa94a62159418597bc"} Apr 20 20:26:06.448485 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:06.448487 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" event={"ID":"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5","Type":"ContainerStarted","Data":"f2aec43a48857f841a446cfffda20267f0774e968d52ee01d7b2bc44699af301"} Apr 20 20:26:06.448965 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:06.448497 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" 
event={"ID":"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5","Type":"ContainerStarted","Data":"e261fefa74bbddbf7db7b43c5f31df2815bf4b9563ce79e2517416d08cea434c"} Apr 20 20:26:06.448965 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:06.448602 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:26:06.450074 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:06.450052 2575 generic.go:358] "Generic (PLEG): container finished" podID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerID="b44bcdcb78025ebaf43b43578bb7056d717d1ce34c402f28a0dda2d9dadc48fc" exitCode=2 Apr 20 20:26:06.450158 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:06.450108 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" event={"ID":"967bd600-f4c2-4001-97a3-77b3ce9d9c1c","Type":"ContainerDied","Data":"b44bcdcb78025ebaf43b43578bb7056d717d1ce34c402f28a0dda2d9dadc48fc"} Apr 20 20:26:06.465377 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:06.465335 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" podStartSLOduration=1.4653209569999999 podStartE2EDuration="1.465320957s" podCreationTimestamp="2026-04-20 20:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:26:06.464410855 +0000 UTC m=+1274.211330938" watchObservedRunningTime="2026-04-20 20:26:06.465320957 +0000 UTC m=+1274.212241005" Apr 20 20:26:07.452974 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:07.452942 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:26:07.454349 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:07.454316 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 20 20:26:08.318336 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.318313 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" Apr 20 20:26:08.364703 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.364677 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmqbb\" (UniqueName: \"kubernetes.io/projected/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-kube-api-access-tmqbb\") pod \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\" (UID: \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\") " Apr 20 20:26:08.364797 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.364751 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-63de8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-success-200-isvc-63de8-kube-rbac-proxy-sar-config\") pod \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\" (UID: \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\") " Apr 20 20:26:08.364797 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.364791 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-proxy-tls\") pod \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\" (UID: \"967bd600-f4c2-4001-97a3-77b3ce9d9c1c\") " Apr 20 20:26:08.365059 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.365033 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-success-200-isvc-63de8-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-63de8-kube-rbac-proxy-sar-config") pod "967bd600-f4c2-4001-97a3-77b3ce9d9c1c" (UID: "967bd600-f4c2-4001-97a3-77b3ce9d9c1c"). InnerVolumeSpecName "success-200-isvc-63de8-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:26:08.366718 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.366698 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "967bd600-f4c2-4001-97a3-77b3ce9d9c1c" (UID: "967bd600-f4c2-4001-97a3-77b3ce9d9c1c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:26:08.366786 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.366732 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-kube-api-access-tmqbb" (OuterVolumeSpecName: "kube-api-access-tmqbb") pod "967bd600-f4c2-4001-97a3-77b3ce9d9c1c" (UID: "967bd600-f4c2-4001-97a3-77b3ce9d9c1c"). InnerVolumeSpecName "kube-api-access-tmqbb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:26:08.456709 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.456645 2575 generic.go:358] "Generic (PLEG): container finished" podID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerID="f0e9bb9e4fe72478b0dccc4c489e9c4ab1752837f21b142f6fe01e87f6df86d3" exitCode=0 Apr 20 20:26:08.457071 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.456726 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" Apr 20 20:26:08.457071 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.456740 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" event={"ID":"967bd600-f4c2-4001-97a3-77b3ce9d9c1c","Type":"ContainerDied","Data":"f0e9bb9e4fe72478b0dccc4c489e9c4ab1752837f21b142f6fe01e87f6df86d3"} Apr 20 20:26:08.457071 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.456793 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s" event={"ID":"967bd600-f4c2-4001-97a3-77b3ce9d9c1c","Type":"ContainerDied","Data":"86737457e3eafd8b5524fd73e7c5b652e0997f8de795db157859d249f414f069"} Apr 20 20:26:08.457071 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.456820 2575 scope.go:117] "RemoveContainer" containerID="b44bcdcb78025ebaf43b43578bb7056d717d1ce34c402f28a0dda2d9dadc48fc" Apr 20 20:26:08.457314 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.457175 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 20 20:26:08.465301 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.465277 2575 scope.go:117] "RemoveContainer" containerID="f0e9bb9e4fe72478b0dccc4c489e9c4ab1752837f21b142f6fe01e87f6df86d3" Apr 20 20:26:08.465395 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.465303 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tmqbb\" (UniqueName: \"kubernetes.io/projected/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-kube-api-access-tmqbb\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\"" Apr 20 20:26:08.465395 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.465324 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-63de8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-success-200-isvc-63de8-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\"" Apr 20 20:26:08.465395 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.465340 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/967bd600-f4c2-4001-97a3-77b3ce9d9c1c-proxy-tls\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\"" Apr 20 20:26:08.471889 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.471873 2575 scope.go:117] "RemoveContainer" containerID="b44bcdcb78025ebaf43b43578bb7056d717d1ce34c402f28a0dda2d9dadc48fc" Apr 20 20:26:08.472118 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:26:08.472099 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b44bcdcb78025ebaf43b43578bb7056d717d1ce34c402f28a0dda2d9dadc48fc\": container with ID starting with b44bcdcb78025ebaf43b43578bb7056d717d1ce34c402f28a0dda2d9dadc48fc not found: ID does not exist" containerID="b44bcdcb78025ebaf43b43578bb7056d717d1ce34c402f28a0dda2d9dadc48fc" Apr 20 20:26:08.472166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.472125 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44bcdcb78025ebaf43b43578bb7056d717d1ce34c402f28a0dda2d9dadc48fc"} err="failed to get container status 
\"b44bcdcb78025ebaf43b43578bb7056d717d1ce34c402f28a0dda2d9dadc48fc\": rpc error: code = NotFound desc = could not find container \"b44bcdcb78025ebaf43b43578bb7056d717d1ce34c402f28a0dda2d9dadc48fc\": container with ID starting with b44bcdcb78025ebaf43b43578bb7056d717d1ce34c402f28a0dda2d9dadc48fc not found: ID does not exist" Apr 20 20:26:08.472166 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.472139 2575 scope.go:117] "RemoveContainer" containerID="f0e9bb9e4fe72478b0dccc4c489e9c4ab1752837f21b142f6fe01e87f6df86d3" Apr 20 20:26:08.472461 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:26:08.472443 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e9bb9e4fe72478b0dccc4c489e9c4ab1752837f21b142f6fe01e87f6df86d3\": container with ID starting with f0e9bb9e4fe72478b0dccc4c489e9c4ab1752837f21b142f6fe01e87f6df86d3 not found: ID does not exist" containerID="f0e9bb9e4fe72478b0dccc4c489e9c4ab1752837f21b142f6fe01e87f6df86d3" Apr 20 20:26:08.472522 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.472467 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e9bb9e4fe72478b0dccc4c489e9c4ab1752837f21b142f6fe01e87f6df86d3"} err="failed to get container status \"f0e9bb9e4fe72478b0dccc4c489e9c4ab1752837f21b142f6fe01e87f6df86d3\": rpc error: code = NotFound desc = could not find container \"f0e9bb9e4fe72478b0dccc4c489e9c4ab1752837f21b142f6fe01e87f6df86d3\": container with ID starting with f0e9bb9e4fe72478b0dccc4c489e9c4ab1752837f21b142f6fe01e87f6df86d3 not found: ID does not exist" Apr 20 20:26:08.479020 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.478998 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"] Apr 20 20:26:08.482690 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.482662 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-63de8-predictor-7d6b9c5fd-p9j7s"] Apr 20 20:26:08.731530 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:08.731461 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" path="/var/lib/kubelet/pods/967bd600-f4c2-4001-97a3-77b3ce9d9c1c/volumes" Apr 20 20:26:13.461600 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:13.461567 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:26:13.463902 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:13.462039 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 20 20:26:14.344322 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:14.344294 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" Apr 20 20:26:23.462804 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:23.462764 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 20 20:26:33.462621 
ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:33.462582 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 20 20:26:43.462845 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:43.462804 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 20 20:26:53.463027 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:26:53.462998 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" Apr 20 20:29:52.719343 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:29:52.719315 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:29:52.722623 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:29:52.722602 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:34:40.462017 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.461937 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"] Apr 20 20:34:40.462540 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.462262 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerName="kserve-container" containerID="cri-o://2339087af217d7e67c87294aa3b7eb282bd65a61f0affcabe164139cd1df3598" gracePeriod=30 Apr 20 20:34:40.462540 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.462312 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerName="kube-rbac-proxy" containerID="cri-o://de6f0f3e68d7b1e65500a66033d23124fdefc74ea142673947fa09198c69c2a1" gracePeriod=30 Apr 20 20:34:40.542235 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.542208 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k"] Apr 20 20:34:40.542535 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.542516 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerName="kube-rbac-proxy" Apr 20 20:34:40.542584 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.542539 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerName="kube-rbac-proxy" Apr 20 20:34:40.542584 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.542577 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerName="kserve-container" Apr 20 20:34:40.542656 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.542587 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerName="kserve-container" Apr 20 20:34:40.542703 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.542691 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerName="kube-rbac-proxy" Apr 20 20:34:40.542740 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.542707 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="967bd600-f4c2-4001-97a3-77b3ce9d9c1c" containerName="kserve-container" Apr 20 20:34:40.545699 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.545684 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" Apr 20 20:34:40.548106 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.548068 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-ab6b8-predictor-serving-cert\"" Apr 20 20:34:40.548238 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.548073 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-ab6b8-kube-rbac-proxy-sar-config\"" Apr 20 20:34:40.564521 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.564497 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k"] Apr 20 20:34:40.579491 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.579467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml6wh\" (UniqueName: \"kubernetes.io/projected/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-kube-api-access-ml6wh\") pod \"success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k\" (UID: \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\") " pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" Apr 20 20:34:40.579606 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.579516 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-proxy-tls\") pod \"success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k\" (UID: \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\") " pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" Apr 20 20:34:40.579606 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.579580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-ab6b8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-success-200-isvc-ab6b8-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k\" (UID: \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\") " pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" Apr 20 20:34:40.680899 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.680863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml6wh\" (UniqueName: \"kubernetes.io/projected/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-kube-api-access-ml6wh\") pod \"success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k\" (UID: \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\") " pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" Apr 20 20:34:40.680899 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.680907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-proxy-tls\") pod \"success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k\" (UID: \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\") " pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" Apr 20 20:34:40.681040 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.681025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-ab6b8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-success-200-isvc-ab6b8-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k\" (UID: \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\") " pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" Apr 20 20:34:40.681653 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.681634 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-ab6b8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-success-200-isvc-ab6b8-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k\" (UID: \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\") " pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" Apr 20 20:34:40.683226 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.683210 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-proxy-tls\") pod \"success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k\" (UID: \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\") " pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" Apr 20 20:34:40.689162 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.689137 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml6wh\" (UniqueName: \"kubernetes.io/projected/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-kube-api-access-ml6wh\") pod \"success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k\" (UID: \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\") " pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" Apr 20 20:34:40.857962 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.857937 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" Apr 20 20:34:40.908059 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.908028 2575 generic.go:358] "Generic (PLEG): container finished" podID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerID="de6f0f3e68d7b1e65500a66033d23124fdefc74ea142673947fa09198c69c2a1" exitCode=2 Apr 20 20:34:40.908215 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.908086 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" event={"ID":"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8","Type":"ContainerDied","Data":"de6f0f3e68d7b1e65500a66033d23124fdefc74ea142673947fa09198c69c2a1"} Apr 20 20:34:40.974514 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.974429 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k"] Apr 20 20:34:40.977002 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:34:40.976971 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2270c01b_2bf1_48db_a32d_6f73b6d0dd32.slice/crio-f7674902568fe954d10d97b752185963ead0caea75159cac26077d0057dce6b9 WatchSource:0}: Error finding container f7674902568fe954d10d97b752185963ead0caea75159cac26077d0057dce6b9: Status 404 returned error can't find the container with id f7674902568fe954d10d97b752185963ead0caea75159cac26077d0057dce6b9 Apr 20 20:34:40.978618 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:40.978600 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:34:41.912173 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:41.912127 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" event={"ID":"2270c01b-2bf1-48db-a32d-6f73b6d0dd32","Type":"ContainerStarted","Data":"79e1e37e8b0a3a53e2f6bf8a0b15d6ca595e5721e8f5bcb7c137fdec2129abd7"} Apr 20 20:34:41.912173 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:41.912168 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" event={"ID":"2270c01b-2bf1-48db-a32d-6f73b6d0dd32","Type":"ContainerStarted","Data":"79751759ef938b0412666ee23b8263edbdb90b10f3dd929f283160a845eaea42"} Apr 20 20:34:41.912606 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:41.912184 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" event={"ID":"2270c01b-2bf1-48db-a32d-6f73b6d0dd32","Type":"ContainerStarted","Data":"f7674902568fe954d10d97b752185963ead0caea75159cac26077d0057dce6b9"} Apr 20 20:34:41.912606 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:41.912373 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" Apr 20 20:34:41.912606 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:41.912400 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" Apr 20 20:34:41.913545 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:41.913517 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 20 20:34:41.933137 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:41.933091 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" podStartSLOduration=1.933077388 podStartE2EDuration="1.933077388s" podCreationTimestamp="2026-04-20 20:34:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:34:41.931311796 +0000 UTC m=+1789.678231867" watchObservedRunningTime="2026-04-20 20:34:41.933077388 +0000 UTC m=+1789.679997459" Apr 20 20:34:42.914560 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:42.914517 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 20 20:34:43.302210 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.302187 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" Apr 20 20:34:43.401722 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.401696 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-38d71-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-success-200-isvc-38d71-kube-rbac-proxy-sar-config\") pod \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\" (UID: \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\") " Apr 20 20:34:43.401841 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.401745 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-proxy-tls\") pod \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\" (UID: \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\") " Apr 20 20:34:43.401841 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.401769 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcnbc\" (UniqueName: \"kubernetes.io/projected/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-kube-api-access-kcnbc\") pod \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\" (UID: \"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8\") " Apr 20 20:34:43.402026 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.402002 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-success-200-isvc-38d71-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-38d71-kube-rbac-proxy-sar-config") pod "e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" (UID: "e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8"). InnerVolumeSpecName "success-200-isvc-38d71-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:34:43.403870 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.403848 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" (UID: "e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:34:43.403956 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.403909 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-kube-api-access-kcnbc" (OuterVolumeSpecName: "kube-api-access-kcnbc") pod "e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" (UID: "e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8"). InnerVolumeSpecName "kube-api-access-kcnbc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:34:43.502920 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.502903 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-38d71-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-success-200-isvc-38d71-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\"" Apr 20 20:34:43.503018 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.502922 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-proxy-tls\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\"" Apr 20 20:34:43.503018 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.502933 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kcnbc\" (UniqueName: \"kubernetes.io/projected/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8-kube-api-access-kcnbc\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\"" Apr 20 20:34:43.918812 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.918731 2575 generic.go:358] "Generic (PLEG): container finished" podID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerID="2339087af217d7e67c87294aa3b7eb282bd65a61f0affcabe164139cd1df3598" exitCode=0 Apr 20 20:34:43.918812 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.918796 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" event={"ID":"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8","Type":"ContainerDied","Data":"2339087af217d7e67c87294aa3b7eb282bd65a61f0affcabe164139cd1df3598"} Apr 20 20:34:43.919233 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.918819 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" event={"ID":"e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8","Type":"ContainerDied","Data":"6881e23389450554d85883dbef2650813e145f8a5868c5f05480cd44a12132f9"} Apr 20 20:34:43.919233 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.918834 2575 scope.go:117] "RemoveContainer" containerID="de6f0f3e68d7b1e65500a66033d23124fdefc74ea142673947fa09198c69c2a1" Apr 20 20:34:43.919233 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.918797 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk" Apr 20 20:34:43.926960 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.926938 2575 scope.go:117] "RemoveContainer" containerID="2339087af217d7e67c87294aa3b7eb282bd65a61f0affcabe164139cd1df3598" Apr 20 20:34:43.933514 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.933496 2575 scope.go:117] "RemoveContainer" containerID="de6f0f3e68d7b1e65500a66033d23124fdefc74ea142673947fa09198c69c2a1" Apr 20 20:34:43.933761 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:34:43.933740 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de6f0f3e68d7b1e65500a66033d23124fdefc74ea142673947fa09198c69c2a1\": container with ID starting with de6f0f3e68d7b1e65500a66033d23124fdefc74ea142673947fa09198c69c2a1 not found: ID does not exist" containerID="de6f0f3e68d7b1e65500a66033d23124fdefc74ea142673947fa09198c69c2a1" Apr 20 20:34:43.933841 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.933767 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6f0f3e68d7b1e65500a66033d23124fdefc74ea142673947fa09198c69c2a1"} err="failed to get container status \"de6f0f3e68d7b1e65500a66033d23124fdefc74ea142673947fa09198c69c2a1\": rpc error: code = NotFound desc = could not find container \"de6f0f3e68d7b1e65500a66033d23124fdefc74ea142673947fa09198c69c2a1\": container with ID starting with de6f0f3e68d7b1e65500a66033d23124fdefc74ea142673947fa09198c69c2a1 not found: ID does not exist" Apr 20 20:34:43.933841 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.933784 2575 scope.go:117] "RemoveContainer" containerID="2339087af217d7e67c87294aa3b7eb282bd65a61f0affcabe164139cd1df3598" Apr 20 20:34:43.934004 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:34:43.933991 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2339087af217d7e67c87294aa3b7eb282bd65a61f0affcabe164139cd1df3598\": container with ID starting with 2339087af217d7e67c87294aa3b7eb282bd65a61f0affcabe164139cd1df3598 not found: ID does not exist" containerID="2339087af217d7e67c87294aa3b7eb282bd65a61f0affcabe164139cd1df3598" Apr 20 20:34:43.934054 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.934007 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2339087af217d7e67c87294aa3b7eb282bd65a61f0affcabe164139cd1df3598"} err="failed to get container status \"2339087af217d7e67c87294aa3b7eb282bd65a61f0affcabe164139cd1df3598\": rpc error: code = NotFound desc = could not find container \"2339087af217d7e67c87294aa3b7eb282bd65a61f0affcabe164139cd1df3598\": container with ID starting with 2339087af217d7e67c87294aa3b7eb282bd65a61f0affcabe164139cd1df3598 not found: ID does not exist" Apr 20 20:34:43.939027 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.939004 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"] Apr 20 20:34:43.942435 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:43.942416 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-38d71-predictor-c6f86d4b4-w2stk"] Apr 20 20:34:44.731131 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:44.731097 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" 
path="/var/lib/kubelet/pods/e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8/volumes" Apr 20 20:34:47.918479 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:47.918452 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" Apr 20 20:34:47.918934 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:47.918903 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 20 20:34:52.738089 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:52.738058 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:34:52.742462 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:52.742439 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:34:57.919059 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:34:57.919008 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 20 20:35:07.919468 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:07.919421 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 20 20:35:17.919239 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:17.919194 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 20 20:35:20.306267 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.306224 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z"] Apr 20 20:35:20.306687 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.306499 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerName="kserve-container" containerID="cri-o://f2aec43a48857f841a446cfffda20267f0774e968d52ee01d7b2bc44699af301" gracePeriod=30 Apr 20 20:35:20.306687 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.306567 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerName="kube-rbac-proxy" containerID="cri-o://a324cc89ba55a598850fa9ec55a44e8f16dd3bad018ddafa94a62159418597bc" gracePeriod=30 Apr 20 20:35:20.336851 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.336829 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t"] Apr 
20 20:35:20.337107 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.337096 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerName="kserve-container" Apr 20 20:35:20.337156 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.337109 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerName="kserve-container" Apr 20 20:35:20.337156 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.337131 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerName="kube-rbac-proxy" Apr 20 20:35:20.337156 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.337136 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerName="kube-rbac-proxy" Apr 20 20:35:20.337282 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.337176 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerName="kserve-container" Apr 20 20:35:20.337282 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.337187 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e37a9c1b-d07a-4b7c-8cf3-d03f4a9040a8" containerName="kube-rbac-proxy" Apr 20 20:35:20.340069 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.340048 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" Apr 20 20:35:20.342575 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.342550 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-60a5c-predictor-serving-cert\"" Apr 20 20:35:20.343549 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.343490 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-60a5c-kube-rbac-proxy-sar-config\"" Apr 20 20:35:20.348444 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.348400 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t"] Apr 20 20:35:20.447833 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.447807 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f989c\" (UniqueName: \"kubernetes.io/projected/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-kube-api-access-f989c\") pod \"success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t\" (UID: \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\") " pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" Apr 20 20:35:20.447959 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.447857 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-proxy-tls\") pod \"success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t\" (UID: \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\") " pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" Apr 20 20:35:20.447959 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.447938 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-60a5c-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-success-200-isvc-60a5c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t\" (UID: \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\") " pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" Apr 20 20:35:20.548735 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.548707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-proxy-tls\") pod \"success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t\" (UID: \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\") " pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" Apr 20 20:35:20.548854 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.548757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-60a5c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-success-200-isvc-60a5c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t\" (UID: \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\") " pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" Apr 20 20:35:20.548854 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.548796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f989c\" (UniqueName: \"kubernetes.io/projected/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-kube-api-access-f989c\") pod \"success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t\" (UID: \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\") " pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" Apr 20 20:35:20.548924 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:35:20.548867 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-serving-cert: secret "success-200-isvc-60a5c-predictor-serving-cert" not found Apr 20 20:35:20.548976 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:35:20.548934 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-proxy-tls podName:103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6 nodeName:}" failed. No retries permitted until 2026-04-20 20:35:21.048913254 +0000 UTC m=+1828.795833314 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-proxy-tls") pod "success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" (UID: "103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6") : secret "success-200-isvc-60a5c-predictor-serving-cert" not found
Apr 20 20:35:20.549403 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.549381 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-60a5c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-success-200-isvc-60a5c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t\" (UID: \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\") " pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t"
Apr 20 20:35:20.557335 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:20.557277 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f989c\" (UniqueName: \"kubernetes.io/projected/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-kube-api-access-f989c\") pod \"success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t\" (UID: \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\") " pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t"
Apr 20 20:35:21.024913 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:21.024879 2575 generic.go:358] "Generic (PLEG): container finished" podID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerID="a324cc89ba55a598850fa9ec55a44e8f16dd3bad018ddafa94a62159418597bc" exitCode=2
Apr 20 20:35:21.025066 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:21.024950 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" event={"ID":"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5","Type":"ContainerDied","Data":"a324cc89ba55a598850fa9ec55a44e8f16dd3bad018ddafa94a62159418597bc"}
Apr 20 20:35:21.052270 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:21.052224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-proxy-tls\") pod \"success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t\" (UID: \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\") " pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t"
Apr 20 20:35:21.054543 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:21.054527 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-proxy-tls\") pod \"success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t\" (UID: \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\") " pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t"
Apr 20 20:35:21.253622 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:21.253584 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t"
Apr 20 20:35:21.370736 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:21.370701 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t"]
Apr 20 20:35:21.373075 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:35:21.373050 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod103ac5eb_fe4a_4dd4_9fda_e59c1d9263d6.slice/crio-3e54fa7de42ed5f3d7c90661ffdd71664b251707d7f8580026c4b50068a7a324 WatchSource:0}: Error finding container 3e54fa7de42ed5f3d7c90661ffdd71664b251707d7f8580026c4b50068a7a324: Status 404 returned error can't find the container with id 3e54fa7de42ed5f3d7c90661ffdd71664b251707d7f8580026c4b50068a7a324
Apr 20 20:35:22.029822 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:22.029785 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" event={"ID":"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6","Type":"ContainerStarted","Data":"e3632519f1570c03452dc6ef56a94e41fc80651bba5da34d413d9e6326f39258"}
Apr 20 20:35:22.029822 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:22.029821 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" event={"ID":"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6","Type":"ContainerStarted","Data":"ee881f2af2c2aa47eb7ab19147409e7531d6d30d8bed29da2391eaa7a51f3d5b"}
Apr 20 20:35:22.029822 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:22.029832 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" event={"ID":"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6","Type":"ContainerStarted","Data":"3e54fa7de42ed5f3d7c90661ffdd71664b251707d7f8580026c4b50068a7a324"}
Apr 20 20:35:22.030128 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:22.030052 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t"
Apr 20 20:35:22.030239 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:22.030214 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t"
Apr 20 20:35:22.031525 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:22.031501 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 20 20:35:22.046094 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:22.046053 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" podStartSLOduration=2.046040163 podStartE2EDuration="2.046040163s" podCreationTimestamp="2026-04-20 20:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:35:22.044294279 +0000 UTC m=+1829.791214353" watchObservedRunningTime="2026-04-20 20:35:22.046040163 +0000 UTC m=+1829.792960231"
Apr 20 20:35:23.032610 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:23.032573 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 20 20:35:23.345955 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:23.345935 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z"
Apr 20 20:35:23.473072 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:23.473044 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-fe410-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-success-200-isvc-fe410-kube-rbac-proxy-sar-config\") pod \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\" (UID: \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\") "
Apr 20 20:35:23.473191 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:23.473088 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-proxy-tls\") pod \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\" (UID: \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\") "
Apr 20 20:35:23.473191 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:23.473114 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdc4p\" (UniqueName: \"kubernetes.io/projected/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-kube-api-access-fdc4p\") pod \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\" (UID: \"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5\") "
Apr 20 20:35:23.473416 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:23.473392 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-success-200-isvc-fe410-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-fe410-kube-rbac-proxy-sar-config") pod "1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" (UID: "1d11f25d-fee7-4da2-84fa-2de26a1f4cb5"). InnerVolumeSpecName "success-200-isvc-fe410-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:35:23.475046 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:23.475026 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" (UID: "1d11f25d-fee7-4da2-84fa-2de26a1f4cb5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:35:23.475138 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:23.475117 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-kube-api-access-fdc4p" (OuterVolumeSpecName: "kube-api-access-fdc4p") pod "1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" (UID: "1d11f25d-fee7-4da2-84fa-2de26a1f4cb5"). InnerVolumeSpecName "kube-api-access-fdc4p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:35:23.573718 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:23.573690 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-proxy-tls\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:35:23.573718 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:23.573715 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fdc4p\" (UniqueName: \"kubernetes.io/projected/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-kube-api-access-fdc4p\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:35:23.573841 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:23.573725 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-fe410-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5-success-200-isvc-fe410-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:35:24.037238 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:24.037202 2575 generic.go:358] "Generic (PLEG): container finished" podID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerID="f2aec43a48857f841a446cfffda20267f0774e968d52ee01d7b2bc44699af301" exitCode=0
Apr 20 20:35:24.037642 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:24.037283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" event={"ID":"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5","Type":"ContainerDied","Data":"f2aec43a48857f841a446cfffda20267f0774e968d52ee01d7b2bc44699af301"}
Apr 20 20:35:24.037642 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:24.037320 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z" event={"ID":"1d11f25d-fee7-4da2-84fa-2de26a1f4cb5","Type":"ContainerDied","Data":"e261fefa74bbddbf7db7b43c5f31df2815bf4b9563ce79e2517416d08cea434c"}
Apr 20 20:35:24.037642 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:24.037321 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z"
Apr 20 20:35:24.037642 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:24.037334 2575 scope.go:117] "RemoveContainer" containerID="a324cc89ba55a598850fa9ec55a44e8f16dd3bad018ddafa94a62159418597bc"
Apr 20 20:35:24.045688 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:24.045672 2575 scope.go:117] "RemoveContainer" containerID="f2aec43a48857f841a446cfffda20267f0774e968d52ee01d7b2bc44699af301"
Apr 20 20:35:24.052461 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:24.052443 2575 scope.go:117] "RemoveContainer" containerID="a324cc89ba55a598850fa9ec55a44e8f16dd3bad018ddafa94a62159418597bc"
Apr 20 20:35:24.052686 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:35:24.052668 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a324cc89ba55a598850fa9ec55a44e8f16dd3bad018ddafa94a62159418597bc\": container with ID starting with a324cc89ba55a598850fa9ec55a44e8f16dd3bad018ddafa94a62159418597bc not found: ID does not exist" containerID="a324cc89ba55a598850fa9ec55a44e8f16dd3bad018ddafa94a62159418597bc"
Apr 20 20:35:24.052741 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:24.052695 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a324cc89ba55a598850fa9ec55a44e8f16dd3bad018ddafa94a62159418597bc"} err="failed to get container status \"a324cc89ba55a598850fa9ec55a44e8f16dd3bad018ddafa94a62159418597bc\": rpc error: code = NotFound desc = could not find container \"a324cc89ba55a598850fa9ec55a44e8f16dd3bad018ddafa94a62159418597bc\": container with ID starting with a324cc89ba55a598850fa9ec55a44e8f16dd3bad018ddafa94a62159418597bc not found: ID does not exist"
Apr 20 20:35:24.052741 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:24.052711 2575 scope.go:117] "RemoveContainer" containerID="f2aec43a48857f841a446cfffda20267f0774e968d52ee01d7b2bc44699af301"
Apr 20 20:35:24.052920 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:35:24.052903 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2aec43a48857f841a446cfffda20267f0774e968d52ee01d7b2bc44699af301\": container with ID starting with f2aec43a48857f841a446cfffda20267f0774e968d52ee01d7b2bc44699af301 not found: ID does not exist" containerID="f2aec43a48857f841a446cfffda20267f0774e968d52ee01d7b2bc44699af301"
Apr 20 20:35:24.052969 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:24.052930 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2aec43a48857f841a446cfffda20267f0774e968d52ee01d7b2bc44699af301"} err="failed to get container status \"f2aec43a48857f841a446cfffda20267f0774e968d52ee01d7b2bc44699af301\": rpc error: code = NotFound desc = could not find container \"f2aec43a48857f841a446cfffda20267f0774e968d52ee01d7b2bc44699af301\": container with ID starting with f2aec43a48857f841a446cfffda20267f0774e968d52ee01d7b2bc44699af301 not found: ID does not exist"
Apr 20 20:35:24.058654 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:24.058634 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z"]
Apr 20 20:35:24.061581 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:24.061562 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fe410-predictor-66c5568bb-9sm5z"]
Apr 20 20:35:24.730965 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:24.730925 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" path="/var/lib/kubelet/pods/1d11f25d-fee7-4da2-84fa-2de26a1f4cb5/volumes"
Apr 20 20:35:27.920189 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:27.920162 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k"
Apr 20 20:35:28.036415 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:28.036389 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t"
Apr 20 20:35:28.036791 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:28.036768 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 20 20:35:38.036915 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:38.036874 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 20 20:35:48.037274 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:48.037223 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 20 20:35:50.797732 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.797694 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k"]
Apr 20 20:35:50.798211 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.798070 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kserve-container" containerID="cri-o://79751759ef938b0412666ee23b8263edbdb90b10f3dd929f283160a845eaea42" gracePeriod=30
Apr 20 20:35:50.798211 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.798111 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kube-rbac-proxy" containerID="cri-o://79e1e37e8b0a3a53e2f6bf8a0b15d6ca595e5721e8f5bcb7c137fdec2129abd7" gracePeriod=30
Apr 20 20:35:50.825239 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.825216 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"]
Apr 20 20:35:50.825497 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.825484 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerName="kserve-container"
Apr 20 20:35:50.825538 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.825499 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerName="kserve-container"
Apr 20 20:35:50.825538 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.825517 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerName="kube-rbac-proxy"
Apr 20 20:35:50.825538 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.825523 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerName="kube-rbac-proxy"
Apr 20 20:35:50.825626 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.825568 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerName="kserve-container"
Apr 20 20:35:50.825626 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.825576 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d11f25d-fee7-4da2-84fa-2de26a1f4cb5" containerName="kube-rbac-proxy"
Apr 20 20:35:50.828534 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.828510 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:50.830947 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.830929 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-57557-predictor-serving-cert\""
Apr 20 20:35:50.830947 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.830944 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-57557-kube-rbac-proxy-sar-config\""
Apr 20 20:35:50.837398 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.837378 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"]
Apr 20 20:35:50.957281 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.957238 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rbwl\" (UniqueName: \"kubernetes.io/projected/bf768051-677e-4649-bf94-78323dfd123e-kube-api-access-4rbwl\") pod \"success-200-isvc-57557-predictor-7cdcf47b7b-75nzd\" (UID: \"bf768051-677e-4649-bf94-78323dfd123e\") " pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:50.957386 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.957299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf768051-677e-4649-bf94-78323dfd123e-proxy-tls\") pod \"success-200-isvc-57557-predictor-7cdcf47b7b-75nzd\" (UID: \"bf768051-677e-4649-bf94-78323dfd123e\") " pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:50.957386 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:50.957318 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-57557-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf768051-677e-4649-bf94-78323dfd123e-success-200-isvc-57557-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-57557-predictor-7cdcf47b7b-75nzd\" (UID: \"bf768051-677e-4649-bf94-78323dfd123e\") " pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:51.057772 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:51.057711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rbwl\" (UniqueName: \"kubernetes.io/projected/bf768051-677e-4649-bf94-78323dfd123e-kube-api-access-4rbwl\") pod \"success-200-isvc-57557-predictor-7cdcf47b7b-75nzd\" (UID: \"bf768051-677e-4649-bf94-78323dfd123e\") " pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:51.057772 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:51.057754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf768051-677e-4649-bf94-78323dfd123e-proxy-tls\") pod \"success-200-isvc-57557-predictor-7cdcf47b7b-75nzd\" (UID: \"bf768051-677e-4649-bf94-78323dfd123e\") " pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:51.057930 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:51.057772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-57557-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf768051-677e-4649-bf94-78323dfd123e-success-200-isvc-57557-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-57557-predictor-7cdcf47b7b-75nzd\" (UID: \"bf768051-677e-4649-bf94-78323dfd123e\") " pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:51.057930 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:35:51.057856 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-57557-predictor-serving-cert: secret "success-200-isvc-57557-predictor-serving-cert" not found
Apr 20 20:35:51.057930 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:35:51.057913 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf768051-677e-4649-bf94-78323dfd123e-proxy-tls podName:bf768051-677e-4649-bf94-78323dfd123e nodeName:}" failed. No retries permitted until 2026-04-20 20:35:51.557893775 +0000 UTC m=+1859.304813824 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/bf768051-677e-4649-bf94-78323dfd123e-proxy-tls") pod "success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" (UID: "bf768051-677e-4649-bf94-78323dfd123e") : secret "success-200-isvc-57557-predictor-serving-cert" not found
Apr 20 20:35:51.058370 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:51.058352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-57557-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf768051-677e-4649-bf94-78323dfd123e-success-200-isvc-57557-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-57557-predictor-7cdcf47b7b-75nzd\" (UID: \"bf768051-677e-4649-bf94-78323dfd123e\") " pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:51.066243 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:51.066219 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rbwl\" (UniqueName: \"kubernetes.io/projected/bf768051-677e-4649-bf94-78323dfd123e-kube-api-access-4rbwl\") pod \"success-200-isvc-57557-predictor-7cdcf47b7b-75nzd\" (UID: \"bf768051-677e-4649-bf94-78323dfd123e\") " pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:51.116801 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:51.116775 2575 generic.go:358] "Generic (PLEG): container finished" podID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerID="79e1e37e8b0a3a53e2f6bf8a0b15d6ca595e5721e8f5bcb7c137fdec2129abd7" exitCode=2
Apr 20 20:35:51.116923 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:51.116838 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" event={"ID":"2270c01b-2bf1-48db-a32d-6f73b6d0dd32","Type":"ContainerDied","Data":"79e1e37e8b0a3a53e2f6bf8a0b15d6ca595e5721e8f5bcb7c137fdec2129abd7"}
Apr 20 20:35:51.562522 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:51.562488 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf768051-677e-4649-bf94-78323dfd123e-proxy-tls\") pod \"success-200-isvc-57557-predictor-7cdcf47b7b-75nzd\" (UID: \"bf768051-677e-4649-bf94-78323dfd123e\") " pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:51.564809 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:51.564790 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf768051-677e-4649-bf94-78323dfd123e-proxy-tls\") pod \"success-200-isvc-57557-predictor-7cdcf47b7b-75nzd\" (UID: \"bf768051-677e-4649-bf94-78323dfd123e\") " pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:51.738844 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:51.738815 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:51.851530 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:51.851463 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"]
Apr 20 20:35:51.854610 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:35:51.854581 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf768051_677e_4649_bf94_78323dfd123e.slice/crio-7c90bf571262f2dbea119b21e25aa6f52c292fd4c673fc568b7b14f69b026cc2 WatchSource:0}: Error finding container 7c90bf571262f2dbea119b21e25aa6f52c292fd4c673fc568b7b14f69b026cc2: Status 404 returned error can't find the container with id 7c90bf571262f2dbea119b21e25aa6f52c292fd4c673fc568b7b14f69b026cc2
Apr 20 20:35:52.121172 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:52.121086 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" event={"ID":"bf768051-677e-4649-bf94-78323dfd123e","Type":"ContainerStarted","Data":"c01531ff299899b49edbc26980ed4760932454195b283500366dbb5e4cdee333"}
Apr 20 20:35:52.121172 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:52.121130 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" event={"ID":"bf768051-677e-4649-bf94-78323dfd123e","Type":"ContainerStarted","Data":"7c43e1baef799b575092fe97a0959837166aafabb5991eb5d3f34e61da7e7c59"}
Apr 20 20:35:52.121172 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:52.121147 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" event={"ID":"bf768051-677e-4649-bf94-78323dfd123e","Type":"ContainerStarted","Data":"7c90bf571262f2dbea119b21e25aa6f52c292fd4c673fc568b7b14f69b026cc2"}
Apr 20 20:35:52.121387 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:52.121350 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:52.121500 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:52.121477 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:52.122715 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:52.122688 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 20 20:35:52.137614 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:52.137562 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" podStartSLOduration=2.137547289 podStartE2EDuration="2.137547289s" podCreationTimestamp="2026-04-20 20:35:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:35:52.136468897 +0000 UTC m=+1859.883388978" watchObservedRunningTime="2026-04-20 20:35:52.137547289 +0000 UTC m=+1859.884467360"
Apr 20 20:35:52.914722 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:52.914682 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.26:8643/healthz\": dial tcp 10.132.0.26:8643: connect: connection refused"
Apr 20 20:35:53.124802 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:53.124707 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 20 20:35:54.031348 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.031324 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k"
Apr 20 20:35:54.128077 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.127997 2575 generic.go:358] "Generic (PLEG): container finished" podID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerID="79751759ef938b0412666ee23b8263edbdb90b10f3dd929f283160a845eaea42" exitCode=0
Apr 20 20:35:54.128215 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.128075 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k"
Apr 20 20:35:54.128215 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.128075 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" event={"ID":"2270c01b-2bf1-48db-a32d-6f73b6d0dd32","Type":"ContainerDied","Data":"79751759ef938b0412666ee23b8263edbdb90b10f3dd929f283160a845eaea42"}
Apr 20 20:35:54.128215 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.128119 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k" event={"ID":"2270c01b-2bf1-48db-a32d-6f73b6d0dd32","Type":"ContainerDied","Data":"f7674902568fe954d10d97b752185963ead0caea75159cac26077d0057dce6b9"}
Apr 20 20:35:54.128215 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.128146 2575 scope.go:117] "RemoveContainer" containerID="79e1e37e8b0a3a53e2f6bf8a0b15d6ca595e5721e8f5bcb7c137fdec2129abd7"
Apr 20 20:35:54.135699 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.135676 2575 scope.go:117] "RemoveContainer" containerID="79751759ef938b0412666ee23b8263edbdb90b10f3dd929f283160a845eaea42"
Apr 20 20:35:54.142287 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.142271 2575 scope.go:117] "RemoveContainer" containerID="79e1e37e8b0a3a53e2f6bf8a0b15d6ca595e5721e8f5bcb7c137fdec2129abd7"
Apr 20 20:35:54.142536 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:35:54.142507 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79e1e37e8b0a3a53e2f6bf8a0b15d6ca595e5721e8f5bcb7c137fdec2129abd7\": container with ID starting with 79e1e37e8b0a3a53e2f6bf8a0b15d6ca595e5721e8f5bcb7c137fdec2129abd7 not found: ID does not exist" containerID="79e1e37e8b0a3a53e2f6bf8a0b15d6ca595e5721e8f5bcb7c137fdec2129abd7"
Apr 20 20:35:54.142580 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.142537 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e1e37e8b0a3a53e2f6bf8a0b15d6ca595e5721e8f5bcb7c137fdec2129abd7"} err="failed to get container status \"79e1e37e8b0a3a53e2f6bf8a0b15d6ca595e5721e8f5bcb7c137fdec2129abd7\": rpc error: code = NotFound desc = could not find container \"79e1e37e8b0a3a53e2f6bf8a0b15d6ca595e5721e8f5bcb7c137fdec2129abd7\": container with ID starting with 79e1e37e8b0a3a53e2f6bf8a0b15d6ca595e5721e8f5bcb7c137fdec2129abd7 not found: ID does not exist"
Apr 20 20:35:54.142580 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.142558 2575 scope.go:117] "RemoveContainer" containerID="79751759ef938b0412666ee23b8263edbdb90b10f3dd929f283160a845eaea42"
Apr 20 20:35:54.142801 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:35:54.142784 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79751759ef938b0412666ee23b8263edbdb90b10f3dd929f283160a845eaea42\": container with ID starting with 79751759ef938b0412666ee23b8263edbdb90b10f3dd929f283160a845eaea42 not found: ID does not exist" containerID="79751759ef938b0412666ee23b8263edbdb90b10f3dd929f283160a845eaea42"
Apr 20 20:35:54.142846 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.142808 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79751759ef938b0412666ee23b8263edbdb90b10f3dd929f283160a845eaea42"} err="failed to get container status \"79751759ef938b0412666ee23b8263edbdb90b10f3dd929f283160a845eaea42\": rpc error: code = NotFound desc = could not find container \"79751759ef938b0412666ee23b8263edbdb90b10f3dd929f283160a845eaea42\": container with ID starting with 79751759ef938b0412666ee23b8263edbdb90b10f3dd929f283160a845eaea42 not found: ID does not exist"
Apr 20 20:35:54.184481 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.184459 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-proxy-tls\") pod \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\" (UID: \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\") "
Apr 20 20:35:54.184565 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.184494 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml6wh\" (UniqueName: \"kubernetes.io/projected/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-kube-api-access-ml6wh\") pod \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\" (UID: \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\") "
Apr 20 20:35:54.184565 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.184529 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-ab6b8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-success-200-isvc-ab6b8-kube-rbac-proxy-sar-config\") pod \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\" (UID: \"2270c01b-2bf1-48db-a32d-6f73b6d0dd32\") "
Apr 20 20:35:54.184875 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.184853 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-success-200-isvc-ab6b8-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-ab6b8-kube-rbac-proxy-sar-config") pod "2270c01b-2bf1-48db-a32d-6f73b6d0dd32" (UID: "2270c01b-2bf1-48db-a32d-6f73b6d0dd32"). InnerVolumeSpecName "success-200-isvc-ab6b8-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:35:54.186335 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.186310 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2270c01b-2bf1-48db-a32d-6f73b6d0dd32" (UID: "2270c01b-2bf1-48db-a32d-6f73b6d0dd32"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:35:54.186494 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.186477 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-kube-api-access-ml6wh" (OuterVolumeSpecName: "kube-api-access-ml6wh") pod "2270c01b-2bf1-48db-a32d-6f73b6d0dd32" (UID: "2270c01b-2bf1-48db-a32d-6f73b6d0dd32"). InnerVolumeSpecName "kube-api-access-ml6wh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:35:54.285294 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.285270 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-proxy-tls\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:35:54.285294 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.285294 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ml6wh\" (UniqueName: \"kubernetes.io/projected/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-kube-api-access-ml6wh\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:35:54.285454 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.285305 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-ab6b8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2270c01b-2bf1-48db-a32d-6f73b6d0dd32-success-200-isvc-ab6b8-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\""
Apr 20 20:35:54.452356 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.452328 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k"]
Apr 20 20:35:54.456290 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.456243 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ab6b8-predictor-6857df4dd-4wr8k"]
Apr 20 20:35:54.731012 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:54.730932 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" path="/var/lib/kubelet/pods/2270c01b-2bf1-48db-a32d-6f73b6d0dd32/volumes"
Apr 20 20:35:58.036920 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:58.036881 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 20 20:35:58.128868 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:58.128844 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"
Apr 20 20:35:58.129286 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:35:58.129238 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 20 20:36:08.037420 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:36:08.037340 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" Apr 20 20:36:08.129702 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:36:08.129666 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 20 20:36:18.129533 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:36:18.129495 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 20 20:36:28.129473 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:36:28.129429 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 20 20:36:38.130214 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:36:38.130186 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" Apr 20 20:39:52.756381 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:39:52.756351 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:39:52.761367 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:39:52.761343 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:44:52.776598 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:44:52.776566 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:44:52.780454 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:44:52.780429 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:45:05.565609 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:05.565527 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"] Apr 20 20:45:05.566009 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:05.565789 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kserve-container" containerID="cri-o://7c43e1baef799b575092fe97a0959837166aafabb5991eb5d3f34e61da7e7c59" gracePeriod=30 Apr 20 20:45:05.566009 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:05.565842 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kube-rbac-proxy" containerID="cri-o://c01531ff299899b49edbc26980ed4760932454195b283500366dbb5e4cdee333" gracePeriod=30 Apr 20 20:45:05.741788 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:05.741759 2575 generic.go:358] "Generic (PLEG): container finished" podID="bf768051-677e-4649-bf94-78323dfd123e" containerID="c01531ff299899b49edbc26980ed4760932454195b283500366dbb5e4cdee333" exitCode=2 Apr 20 20:45:05.741920 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:05.741811 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" event={"ID":"bf768051-677e-4649-bf94-78323dfd123e","Type":"ContainerDied","Data":"c01531ff299899b49edbc26980ed4760932454195b283500366dbb5e4cdee333"} Apr 20 20:45:08.125876 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.125836 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.28:8643/healthz\": dial tcp 10.132.0.28:8643: connect: connection refused" Apr 20 20:45:08.202481 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.202460 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" Apr 20 20:45:08.255773 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.255752 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf768051-677e-4649-bf94-78323dfd123e-proxy-tls\") pod \"bf768051-677e-4649-bf94-78323dfd123e\" (UID: \"bf768051-677e-4649-bf94-78323dfd123e\") " Apr 20 20:45:08.255886 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.255790 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rbwl\" (UniqueName: \"kubernetes.io/projected/bf768051-677e-4649-bf94-78323dfd123e-kube-api-access-4rbwl\") pod \"bf768051-677e-4649-bf94-78323dfd123e\" (UID: \"bf768051-677e-4649-bf94-78323dfd123e\") " Apr 20 20:45:08.255886 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.255825 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-57557-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf768051-677e-4649-bf94-78323dfd123e-success-200-isvc-57557-kube-rbac-proxy-sar-config\") pod \"bf768051-677e-4649-bf94-78323dfd123e\" (UID: \"bf768051-677e-4649-bf94-78323dfd123e\") " Apr 20 20:45:08.256185 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.256162 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf768051-677e-4649-bf94-78323dfd123e-success-200-isvc-57557-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-57557-kube-rbac-proxy-sar-config") pod "bf768051-677e-4649-bf94-78323dfd123e" (UID: "bf768051-677e-4649-bf94-78323dfd123e"). InnerVolumeSpecName "success-200-isvc-57557-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:45:08.257798 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.257773 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf768051-677e-4649-bf94-78323dfd123e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bf768051-677e-4649-bf94-78323dfd123e" (UID: "bf768051-677e-4649-bf94-78323dfd123e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:45:08.257906 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.257889 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf768051-677e-4649-bf94-78323dfd123e-kube-api-access-4rbwl" (OuterVolumeSpecName: "kube-api-access-4rbwl") pod "bf768051-677e-4649-bf94-78323dfd123e" (UID: "bf768051-677e-4649-bf94-78323dfd123e"). InnerVolumeSpecName "kube-api-access-4rbwl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:45:08.357027 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.356972 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf768051-677e-4649-bf94-78323dfd123e-proxy-tls\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\"" Apr 20 20:45:08.357027 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.356994 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4rbwl\" (UniqueName: \"kubernetes.io/projected/bf768051-677e-4649-bf94-78323dfd123e-kube-api-access-4rbwl\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\"" Apr 20 20:45:08.357027 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.357005 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-57557-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf768051-677e-4649-bf94-78323dfd123e-success-200-isvc-57557-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\"" Apr 20 20:45:08.752598 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.752567 2575 generic.go:358] "Generic (PLEG): container finished" podID="bf768051-677e-4649-bf94-78323dfd123e" containerID="7c43e1baef799b575092fe97a0959837166aafabb5991eb5d3f34e61da7e7c59" exitCode=0 Apr 20 20:45:08.752734 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.752617 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" event={"ID":"bf768051-677e-4649-bf94-78323dfd123e","Type":"ContainerDied","Data":"7c43e1baef799b575092fe97a0959837166aafabb5991eb5d3f34e61da7e7c59"} Apr 20 20:45:08.752734 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.752640 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" Apr 20 20:45:08.752734 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.752652 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" event={"ID":"bf768051-677e-4649-bf94-78323dfd123e","Type":"ContainerDied","Data":"7c90bf571262f2dbea119b21e25aa6f52c292fd4c673fc568b7b14f69b026cc2"} Apr 20 20:45:08.752734 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.752673 2575 scope.go:117] "RemoveContainer" containerID="c01531ff299899b49edbc26980ed4760932454195b283500366dbb5e4cdee333" Apr 20 20:45:08.760441 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.760396 2575 scope.go:117] "RemoveContainer" containerID="7c43e1baef799b575092fe97a0959837166aafabb5991eb5d3f34e61da7e7c59" Apr 20 20:45:08.766984 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.766961 2575 scope.go:117] "RemoveContainer" containerID="c01531ff299899b49edbc26980ed4760932454195b283500366dbb5e4cdee333" Apr 20 20:45:08.767562 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:45:08.767541 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c01531ff299899b49edbc26980ed4760932454195b283500366dbb5e4cdee333\": container with ID starting with c01531ff299899b49edbc26980ed4760932454195b283500366dbb5e4cdee333 not found: ID does not exist" containerID="c01531ff299899b49edbc26980ed4760932454195b283500366dbb5e4cdee333" Apr 20 20:45:08.767686 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.767574 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01531ff299899b49edbc26980ed4760932454195b283500366dbb5e4cdee333"} err="failed to get container status \"c01531ff299899b49edbc26980ed4760932454195b283500366dbb5e4cdee333\": rpc error: code = NotFound desc = could not find container \"c01531ff299899b49edbc26980ed4760932454195b283500366dbb5e4cdee333\": container with ID starting with c01531ff299899b49edbc26980ed4760932454195b283500366dbb5e4cdee333 not found: ID does not exist" Apr 20 20:45:08.767686 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.767593 2575 scope.go:117] "RemoveContainer" containerID="7c43e1baef799b575092fe97a0959837166aafabb5991eb5d3f34e61da7e7c59" Apr 20 20:45:08.767880 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:45:08.767860 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c43e1baef799b575092fe97a0959837166aafabb5991eb5d3f34e61da7e7c59\": container with ID starting with 7c43e1baef799b575092fe97a0959837166aafabb5991eb5d3f34e61da7e7c59 not found: ID does not exist" containerID="7c43e1baef799b575092fe97a0959837166aafabb5991eb5d3f34e61da7e7c59" Apr 20 20:45:08.767943 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.767889 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c43e1baef799b575092fe97a0959837166aafabb5991eb5d3f34e61da7e7c59"} err="failed to get container status \"7c43e1baef799b575092fe97a0959837166aafabb5991eb5d3f34e61da7e7c59\": rpc error: code = NotFound desc = could not find container \"7c43e1baef799b575092fe97a0959837166aafabb5991eb5d3f34e61da7e7c59\": container with ID starting with 7c43e1baef799b575092fe97a0959837166aafabb5991eb5d3f34e61da7e7c59 not found: ID does not exist" Apr 20 20:45:08.768511 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.768489 2575 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"] Apr 20 20:45:08.772266 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:08.772232 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd"] Apr 20 20:45:09.130935 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:09.130898 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-57557-predictor-7cdcf47b7b-75nzd" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: i/o timeout" Apr 20 20:45:10.731390 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:45:10.731356 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf768051-677e-4649-bf94-78323dfd123e" path="/var/lib/kubelet/pods/bf768051-677e-4649-bf94-78323dfd123e/volumes" Apr 20 20:49:52.796688 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:49:52.796581 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:49:52.800934 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:49:52.800910 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:52:40.023167 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:40.023086 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t"] Apr 20 20:52:40.023685 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:40.023451 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerName="kserve-container" containerID="cri-o://ee881f2af2c2aa47eb7ab19147409e7531d6d30d8bed29da2391eaa7a51f3d5b" gracePeriod=30 Apr 20 20:52:40.023685 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:40.023497 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerName="kube-rbac-proxy" containerID="cri-o://e3632519f1570c03452dc6ef56a94e41fc80651bba5da34d413d9e6326f39258" gracePeriod=30 Apr 20 20:52:41.045438 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:41.045405 2575 generic.go:358] "Generic (PLEG): container finished" podID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerID="e3632519f1570c03452dc6ef56a94e41fc80651bba5da34d413d9e6326f39258" exitCode=2 Apr 20 20:52:41.045438 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:41.045443 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" event={"ID":"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6","Type":"ContainerDied","Data":"e3632519f1570c03452dc6ef56a94e41fc80651bba5da34d413d9e6326f39258"} Apr 20 20:52:42.460482 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:42.460458 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" Apr 20 20:52:42.611373 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:42.611338 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-proxy-tls\") pod \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\" (UID: \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\") " Apr 20 20:52:42.611517 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:42.611389 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f989c\" (UniqueName: \"kubernetes.io/projected/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-kube-api-access-f989c\") pod \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\" (UID: \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\") " Apr 20 20:52:42.611517 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:42.611414 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-60a5c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-success-200-isvc-60a5c-kube-rbac-proxy-sar-config\") pod \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\" (UID: \"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6\") " Apr 20 20:52:42.611799 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:42.611766 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-success-200-isvc-60a5c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-60a5c-kube-rbac-proxy-sar-config") pod "103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" (UID: "103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6"). InnerVolumeSpecName "success-200-isvc-60a5c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:52:42.613500 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:42.613468 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" (UID: "103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:52:42.613601 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:42.613574 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-kube-api-access-f989c" (OuterVolumeSpecName: "kube-api-access-f989c") pod "103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" (UID: "103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6"). InnerVolumeSpecName "kube-api-access-f989c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:52:42.712496 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:42.712458 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-proxy-tls\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\"" Apr 20 20:52:42.712496 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:42.712495 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f989c\" (UniqueName: \"kubernetes.io/projected/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-kube-api-access-f989c\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\"" Apr 20 20:52:42.712496 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:42.712506 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-60a5c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6-success-200-isvc-60a5c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-19.ec2.internal\" DevicePath \"\"" Apr 20 20:52:43.052760 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:43.052723 2575 generic.go:358] "Generic (PLEG): container finished" podID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerID="ee881f2af2c2aa47eb7ab19147409e7531d6d30d8bed29da2391eaa7a51f3d5b" exitCode=0 Apr 20 20:52:43.052883 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:43.052815 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" Apr 20 20:52:43.052883 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:43.052814 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" event={"ID":"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6","Type":"ContainerDied","Data":"ee881f2af2c2aa47eb7ab19147409e7531d6d30d8bed29da2391eaa7a51f3d5b"} Apr 20 20:52:43.052883 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:43.052868 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t" event={"ID":"103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6","Type":"ContainerDied","Data":"3e54fa7de42ed5f3d7c90661ffdd71664b251707d7f8580026c4b50068a7a324"} Apr 20 20:52:43.052994 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:43.052894 2575 scope.go:117] "RemoveContainer" containerID="e3632519f1570c03452dc6ef56a94e41fc80651bba5da34d413d9e6326f39258" Apr 20 20:52:43.061390 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:43.061355 2575 scope.go:117] "RemoveContainer" containerID="ee881f2af2c2aa47eb7ab19147409e7531d6d30d8bed29da2391eaa7a51f3d5b" Apr 20 20:52:43.068517 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:43.068497 2575 scope.go:117] "RemoveContainer" containerID="e3632519f1570c03452dc6ef56a94e41fc80651bba5da34d413d9e6326f39258" Apr 20 20:52:43.068861 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:52:43.068823 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3632519f1570c03452dc6ef56a94e41fc80651bba5da34d413d9e6326f39258\": container with ID starting with e3632519f1570c03452dc6ef56a94e41fc80651bba5da34d413d9e6326f39258 not found: ID does not exist" containerID="e3632519f1570c03452dc6ef56a94e41fc80651bba5da34d413d9e6326f39258" Apr 20 20:52:43.068964 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:43.068868 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e3632519f1570c03452dc6ef56a94e41fc80651bba5da34d413d9e6326f39258"} err="failed to get container status \"e3632519f1570c03452dc6ef56a94e41fc80651bba5da34d413d9e6326f39258\": rpc error: code = NotFound desc = could not find container \"e3632519f1570c03452dc6ef56a94e41fc80651bba5da34d413d9e6326f39258\": container with ID starting with e3632519f1570c03452dc6ef56a94e41fc80651bba5da34d413d9e6326f39258 not found: ID does not exist" Apr 20 20:52:43.068964 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:43.068888 2575 scope.go:117] "RemoveContainer" containerID="ee881f2af2c2aa47eb7ab19147409e7531d6d30d8bed29da2391eaa7a51f3d5b" Apr 20 20:52:43.069456 ip-10-0-140-19 kubenswrapper[2575]: E0420 20:52:43.069425 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee881f2af2c2aa47eb7ab19147409e7531d6d30d8bed29da2391eaa7a51f3d5b\": container with ID starting with ee881f2af2c2aa47eb7ab19147409e7531d6d30d8bed29da2391eaa7a51f3d5b not found: ID does not exist" containerID="ee881f2af2c2aa47eb7ab19147409e7531d6d30d8bed29da2391eaa7a51f3d5b" Apr 20 20:52:43.069589 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:43.069460 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee881f2af2c2aa47eb7ab19147409e7531d6d30d8bed29da2391eaa7a51f3d5b"} err="failed to get container status \"ee881f2af2c2aa47eb7ab19147409e7531d6d30d8bed29da2391eaa7a51f3d5b\": rpc error: code = NotFound desc = could not find container \"ee881f2af2c2aa47eb7ab19147409e7531d6d30d8bed29da2391eaa7a51f3d5b\": container with ID starting with ee881f2af2c2aa47eb7ab19147409e7531d6d30d8bed29da2391eaa7a51f3d5b not found: ID does not exist" Apr 20 20:52:43.070987 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:43.070958 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t"] Apr 20 20:52:43.078736 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:43.078712 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60a5c-predictor-5fbfb7dd4-hv76t"] Apr 20 20:52:44.731632 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:52:44.731599 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" path="/var/lib/kubelet/pods/103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6/volumes" Apr 20 20:53:05.276409 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276373 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2h9np/must-gather-x4jss"] Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276614 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerName="kserve-container" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276625 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerName="kserve-container" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276635 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kserve-container" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276641 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kserve-container" Apr 20 20:53:05.276811 
ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276650 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kube-rbac-proxy" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276656 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kube-rbac-proxy" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276664 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kserve-container" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276669 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kserve-container" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276683 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerName="kube-rbac-proxy" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276688 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerName="kube-rbac-proxy" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276695 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kube-rbac-proxy" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276701 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kube-rbac-proxy" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276744 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerName="kserve-container" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276751 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kserve-container" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276758 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kserve-container" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276764 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf768051-677e-4649-bf94-78323dfd123e" containerName="kube-rbac-proxy" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276770 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2270c01b-2bf1-48db-a32d-6f73b6d0dd32" containerName="kube-rbac-proxy" Apr 20 20:53:05.276811 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.276776 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="103ac5eb-fe4a-4dd4-9fda-e59c1d9263d6" containerName="kube-rbac-proxy" Apr 20 20:53:05.279795 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.279778 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2h9np/must-gather-x4jss" Apr 20 20:53:05.282906 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.282879 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2h9np\"/\"kube-root-ca.crt\"" Apr 20 20:53:05.284394 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.284366 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2h9np\"/\"openshift-service-ca.crt\"" Apr 20 20:53:05.284639 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.284620 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2h9np\"/\"default-dockercfg-9jbc7\"" Apr 20 20:53:05.285583 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.285560 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2h9np/must-gather-x4jss"] Apr 20 20:53:05.356355 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.356325 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a84fb812-c8fd-436a-805c-d609e12d7369-must-gather-output\") pod \"must-gather-x4jss\" (UID: \"a84fb812-c8fd-436a-805c-d609e12d7369\") " pod="openshift-must-gather-2h9np/must-gather-x4jss" Apr 20 20:53:05.356470 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.356394 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg5k4\" (UniqueName: \"kubernetes.io/projected/a84fb812-c8fd-436a-805c-d609e12d7369-kube-api-access-mg5k4\") pod \"must-gather-x4jss\" (UID: \"a84fb812-c8fd-436a-805c-d609e12d7369\") " pod="openshift-must-gather-2h9np/must-gather-x4jss" Apr 20 20:53:05.457682 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.457649 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mg5k4\" (UniqueName: \"kubernetes.io/projected/a84fb812-c8fd-436a-805c-d609e12d7369-kube-api-access-mg5k4\") pod \"must-gather-x4jss\" (UID: \"a84fb812-c8fd-436a-805c-d609e12d7369\") " pod="openshift-must-gather-2h9np/must-gather-x4jss" Apr 20 20:53:05.457794 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.457701 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a84fb812-c8fd-436a-805c-d609e12d7369-must-gather-output\") pod \"must-gather-x4jss\" (UID: \"a84fb812-c8fd-436a-805c-d609e12d7369\") " pod="openshift-must-gather-2h9np/must-gather-x4jss" Apr 20 20:53:05.458025 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.458003 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a84fb812-c8fd-436a-805c-d609e12d7369-must-gather-output\") pod \"must-gather-x4jss\" (UID: \"a84fb812-c8fd-436a-805c-d609e12d7369\") " pod="openshift-must-gather-2h9np/must-gather-x4jss" Apr 20 20:53:05.465833 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.465810 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg5k4\" (UniqueName: \"kubernetes.io/projected/a84fb812-c8fd-436a-805c-d609e12d7369-kube-api-access-mg5k4\") pod \"must-gather-x4jss\" (UID: \"a84fb812-c8fd-436a-805c-d609e12d7369\") " pod="openshift-must-gather-2h9np/must-gather-x4jss" Apr 20 20:53:05.590033 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.589972 2575 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-2h9np/must-gather-x4jss" Apr 20 20:53:05.705193 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.705157 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2h9np/must-gather-x4jss"] Apr 20 20:53:05.707338 ip-10-0-140-19 kubenswrapper[2575]: W0420 20:53:05.707307 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda84fb812_c8fd_436a_805c_d609e12d7369.slice/crio-18655a6393aab3faf61e8a545f3c60d43f5a56192df6b2618a1814efba166f46 WatchSource:0}: Error finding container 18655a6393aab3faf61e8a545f3c60d43f5a56192df6b2618a1814efba166f46: Status 404 returned error can't find the container with id 18655a6393aab3faf61e8a545f3c60d43f5a56192df6b2618a1814efba166f46 Apr 20 20:53:05.709111 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:05.709092 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:53:06.124698 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:06.124658 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2h9np/must-gather-x4jss" event={"ID":"a84fb812-c8fd-436a-805c-d609e12d7369","Type":"ContainerStarted","Data":"18655a6393aab3faf61e8a545f3c60d43f5a56192df6b2618a1814efba166f46"} Apr 20 20:53:07.131893 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:07.131124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2h9np/must-gather-x4jss" event={"ID":"a84fb812-c8fd-436a-805c-d609e12d7369","Type":"ContainerStarted","Data":"0596bc687688f42c0196ea03f3f20a86a08d99940b0a36815c3459980eaa00af"} Apr 20 20:53:07.131893 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:07.131166 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2h9np/must-gather-x4jss" event={"ID":"a84fb812-c8fd-436a-805c-d609e12d7369","Type":"ContainerStarted","Data":"08a96a95671dd86f707798f6c5c176c7d458dc5505280a7c610025cb967048cf"} Apr 20 20:53:07.147405 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:07.147357 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2h9np/must-gather-x4jss" podStartSLOduration=1.359960121 podStartE2EDuration="2.147340019s" podCreationTimestamp="2026-04-20 20:53:05 +0000 UTC" firstStartedPulling="2026-04-20 20:53:05.70926763 +0000 UTC m=+2893.456187682" lastFinishedPulling="2026-04-20 20:53:06.496647528 +0000 UTC m=+2894.243567580" observedRunningTime="2026-04-20 20:53:07.145567995 +0000 UTC m=+2894.892488068" watchObservedRunningTime="2026-04-20 20:53:07.147340019 +0000 UTC m=+2894.894260089" Apr 20 20:53:07.866387 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:07.866354 2575 ???:1] "http: TLS handshake error from 10.0.130.227:55866: EOF" Apr 20 20:53:07.875417 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:07.875390 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-j6rtl_135fcd8b-54c2-4eb1-bd4f-a23d6443033d/global-pull-secret-syncer/0.log" Apr 20 20:53:08.068434 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:08.068391 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-shngg_6d199ee7-aff2-40a9-92ab-a7ddd2f94088/konnectivity-agent/0.log" Apr 20 20:53:08.141455 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:08.141373 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-19.ec2.internal_a1223cbd65770cd535fe3d6f40564646/haproxy/0.log" Apr 20 20:53:11.930240 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:11.930206 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dc2vn_164da49e-606e-49b0-9bd5-8181919bd837/node-exporter/0.log" Apr 20 20:53:11.952943 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:11.952918 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dc2vn_164da49e-606e-49b0-9bd5-8181919bd837/kube-rbac-proxy/0.log" Apr 20 20:53:11.972360 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:11.972334 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dc2vn_164da49e-606e-49b0-9bd5-8181919bd837/init-textfile/0.log" Apr 20 20:53:14.997050 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:14.997021 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k"] Apr 20 20:53:15.000786 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.000767 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.013272 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.012596 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k"] Apr 20 20:53:15.037493 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.037468 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/76f84680-acb5-4280-b450-0d7e3e740423-sys\") pod \"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.037637 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.037500 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/76f84680-acb5-4280-b450-0d7e3e740423-proc\") pod \"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.037637 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.037531 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfdrk\" (UniqueName: \"kubernetes.io/projected/76f84680-acb5-4280-b450-0d7e3e740423-kube-api-access-vfdrk\") pod \"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.037637 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.037581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/76f84680-acb5-4280-b450-0d7e3e740423-lib-modules\") pod \"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.037637 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.037603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/76f84680-acb5-4280-b450-0d7e3e740423-podres\") pod 
\"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.138827 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.138793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfdrk\" (UniqueName: \"kubernetes.io/projected/76f84680-acb5-4280-b450-0d7e3e740423-kube-api-access-vfdrk\") pod \"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.138991 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.138838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/76f84680-acb5-4280-b450-0d7e3e740423-lib-modules\") pod \"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.138991 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.138859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/76f84680-acb5-4280-b450-0d7e3e740423-podres\") pod \"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.138991 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.138952 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/76f84680-acb5-4280-b450-0d7e3e740423-podres\") pod \"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.138991 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.138967 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/76f84680-acb5-4280-b450-0d7e3e740423-lib-modules\") pod \"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.139187 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.138978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/76f84680-acb5-4280-b450-0d7e3e740423-sys\") pod \"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.139187 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.139002 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/76f84680-acb5-4280-b450-0d7e3e740423-sys\") pod \"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.139187 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.139026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/76f84680-acb5-4280-b450-0d7e3e740423-proc\") pod \"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.139187 ip-10-0-140-19 
kubenswrapper[2575]: I0420 20:53:15.139066 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/76f84680-acb5-4280-b450-0d7e3e740423-proc\") pod \"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.148471 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.148422 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfdrk\" (UniqueName: \"kubernetes.io/projected/76f84680-acb5-4280-b450-0d7e3e740423-kube-api-access-vfdrk\") pod \"perf-node-gather-daemonset-l4k7k\" (UID: \"76f84680-acb5-4280-b450-0d7e3e740423\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.311808 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.311730 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:15.432393 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.432364 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k"] Apr 20 20:53:15.586511 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.586460 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j7twh_77bd0885-5141-4657-8ae7-140bbc18a034/dns/0.log" Apr 20 20:53:15.605780 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.605757 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j7twh_77bd0885-5141-4657-8ae7-140bbc18a034/kube-rbac-proxy/0.log" Apr 20 20:53:15.711857 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:15.711838 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rlhqx_6c86c1d8-db05-4fb0-9906-f6e203ab0fc0/dns-node-resolver/0.log" Apr 20 20:53:16.135032 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:16.134996 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qcx6p_0ffcd701-9c8c-423c-adee-9e708c55207b/node-ca/0.log" Apr 20 20:53:16.172056 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:16.172013 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" event={"ID":"76f84680-acb5-4280-b450-0d7e3e740423","Type":"ContainerStarted","Data":"0e76a7858f4322f616646a4d6b7090c2a0f85feea77f74bfcc2db0074f75be70"} Apr 20 20:53:16.172270 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:16.172063 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" event={"ID":"76f84680-acb5-4280-b450-0d7e3e740423","Type":"ContainerStarted","Data":"6d272dc4800642d2ef251cf3bfcbee42a99f2ee0846eb52926ac066e3033cd92"} Apr 20 20:53:16.172834 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:16.172802 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:16.189744 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:16.189692 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" podStartSLOduration=2.18967553 podStartE2EDuration="2.18967553s" podCreationTimestamp="2026-04-20 20:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-20 20:53:16.187422539 +0000 UTC m=+2903.934342611" watchObservedRunningTime="2026-04-20 20:53:16.18967553 +0000 UTC m=+2903.936595602" Apr 20 20:53:17.168978 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:17.168949 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2kdv4_c8c422a0-bcac-4afe-96fe-8b9874ed46e5/serve-healthcheck-canary/0.log" Apr 20 20:53:17.637186 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:17.637153 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rsxtz_e848e523-e953-4cef-b352-34e3d9adf16c/kube-rbac-proxy/0.log" Apr 20 20:53:17.656668 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:17.656644 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rsxtz_e848e523-e953-4cef-b352-34e3d9adf16c/exporter/0.log" Apr 20 20:53:17.676145 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:17.676122 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rsxtz_e848e523-e953-4cef-b352-34e3d9adf16c/extractor/0.log" Apr 20 20:53:19.626519 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:19.626492 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-hdwvb_a85e939b-9f28-4c09-80c6-c476a342044c/server/0.log" Apr 20 20:53:19.910659 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:19.910588 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-2f9qj_5e9c78a9-f304-4c12-be6d-e096d460953e/manager/0.log" Apr 20 20:53:19.953799 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:19.953773 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-d4z2x_10c64b38-1d2a-4779-9775-beb30405b81f/seaweedfs/0.log" Apr 20 20:53:23.187638 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:23.187606 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-l4k7k" Apr 20 20:53:24.602942 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:24.602914 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9bpfb_07203ee1-3321-4aa2-bcf0-1688aa2911e0/kube-multus/0.log" Apr 20 20:53:24.633442 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:24.633419 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5x687_397c50ea-cc56-4ac2-a69e-090e94977ed9/kube-multus-additional-cni-plugins/0.log" Apr 20 20:53:24.655553 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:24.655526 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5x687_397c50ea-cc56-4ac2-a69e-090e94977ed9/egress-router-binary-copy/0.log" Apr 20 20:53:24.675501 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:24.675466 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5x687_397c50ea-cc56-4ac2-a69e-090e94977ed9/cni-plugins/0.log" Apr 20 20:53:24.699291 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:24.699263 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5x687_397c50ea-cc56-4ac2-a69e-090e94977ed9/bond-cni-plugin/0.log" Apr 20 20:53:24.720902 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:24.720873 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5x687_397c50ea-cc56-4ac2-a69e-090e94977ed9/routeoverride-cni/0.log" Apr 20 20:53:24.745854 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:24.745830 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5x687_397c50ea-cc56-4ac2-a69e-090e94977ed9/whereabouts-cni-bincopy/0.log" Apr 20 20:53:24.767560 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:24.767529 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5x687_397c50ea-cc56-4ac2-a69e-090e94977ed9/whereabouts-cni/0.log" Apr 20 20:53:25.132446 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:25.132376 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gf2gx_de77e01d-c1e5-4a7e-99df-1261a9d21bed/network-metrics-daemon/0.log" Apr 20 20:53:25.154170 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:25.154133 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gf2gx_de77e01d-c1e5-4a7e-99df-1261a9d21bed/kube-rbac-proxy/0.log" Apr 20 20:53:26.665031 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:26.665003 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-controller/0.log" Apr 20 20:53:26.687407 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:26.687370 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/0.log" Apr 20 20:53:26.702765 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:26.702735 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovn-acl-logging/1.log" Apr 20 20:53:26.723425 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:26.723392 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/kube-rbac-proxy-node/0.log" Apr 20 20:53:26.743787 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:26.743763 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 20:53:26.759896 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:26.759873 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/northd/0.log" Apr 20 20:53:26.779505 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:26.779487 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/nbdb/0.log" Apr 20 20:53:26.797958 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:26.797937 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/sbdb/0.log" Apr 20 20:53:26.915465 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:26.915392 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwd67_f71db6ea-fd4d-4236-94af-1a0b3a3c623f/ovnkube-controller/0.log" Apr 20 20:53:27.928868 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:27.928806 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-target-rn94w_b638336f-46b0-4174-be51-b9aa9a0f9341/network-check-target-container/0.log" Apr 20 20:53:28.713921 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:28.713895 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-bb4hw_73708303-ce7f-455e-9004-696bd9c8fe9f/iptables-alerter/0.log" Apr 20 20:53:29.351493 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:29.351463 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-mnj4w_57392845-ccc8-4912-8291-1fd220cb1fee/tuned/0.log" Apr 20 20:53:32.603614 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:32.603587 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-zmp6c_32a74b02-cb51-4f5d-91a3-63f0dac5b718/csi-driver/0.log" Apr 20 20:53:32.622805 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:32.622753 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-zmp6c_32a74b02-cb51-4f5d-91a3-63f0dac5b718/csi-node-driver-registrar/0.log" Apr 20 20:53:32.642086 ip-10-0-140-19 kubenswrapper[2575]: I0420 20:53:32.642053 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-zmp6c_32a74b02-cb51-4f5d-91a3-63f0dac5b718/csi-liveness-probe/0.log"