Apr 16 18:10:38.488369 ip-10-0-141-192 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:10:38.972424 ip-10-0-141-192 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:10:38.972424 ip-10-0-141-192 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:10:38.972424 ip-10-0-141-192 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:10:38.972424 ip-10-0-141-192 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:10:38.972424 ip-10-0-141-192 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:10:38.975274 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.975190 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:10:38.978495 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978480 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:10:38.978495 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978495 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978499 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978503 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978506 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978509 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978511 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978514 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978517 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978520 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978522 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978525 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
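The deprecated-flag warnings at the top all point the same way: those settings belong in the file named by --config (here /etc/kubernetes/kubelet.conf, per the FLAG: dump further down). As a rough sketch only, using KubeletConfiguration v1beta1 field names and the values this kubelet was actually started with, the migration would look something like:

    # Sketch, not this node's real config: config-file equivalents of the
    # deprecated flags warned about above; values copied from the FLAG: dump below.
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: "/var/run/crio/crio.sock"            # was --container-runtime-endpoint
    volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec" # was --volume-plugin-dir
    systemReserved:                                                # was --system-reserved
      cpu: "500m"
      ephemeral-storage: "1Gi"
      memory: "1Gi"

--minimum-container-ttl-duration has no one-to-one field; per the warning it is superseded by evictionHard/evictionSoft thresholds, which would be chosen per workload.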
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978533 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978536 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978540 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978542 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978545 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978548 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978550 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978553 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978556 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:10:38.978562 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978559 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978562 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978564 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978567 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978572 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978576 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978579 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978581 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978584 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978587 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978589 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978593 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978596 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978599 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978602 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978604 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978607 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978610 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978612 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:10:38.979027 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978615 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978618 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978620 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978623 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978625 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978628 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978630 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978633 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978635 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978637 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978640 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978643 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978645 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978648 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978651 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978656 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978659 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978661 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978664 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:10:38.979506 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978667 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978669 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978672 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978675 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978677 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978680 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978683 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978691 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978694 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978697 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978699 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978702 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978704 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978707 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978711 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978713 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978716 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978718 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978721 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978723 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:10:38.979965 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978726 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978728 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978731 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978734 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978736 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978738 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.978741 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979190 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979196 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979199 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979201 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979204 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979207 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979210 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979213 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979215 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979219 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979223 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979226 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:10:38.980458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979235 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979238 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979240 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979243 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979245 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979248 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979251 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979254 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979256 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979259 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979261 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979264 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979267 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979269 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979271 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979274 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979277 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979279 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979284 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:10:38.980913 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979287 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979291 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979294 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979297 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979300 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979303 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979306 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979309 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979312 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979314 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979317 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979319 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979322 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979324 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979332 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979335 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979338 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979341 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979343 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979345 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:10:38.981400 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979348 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979350 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979353 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979356 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979358 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979361 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979363 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979366 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979368 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979371 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979373 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979376 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979379 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979382 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979385 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979387 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979390 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979393 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979395 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979398 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:10:38.981881 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979400 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979403 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979406 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979408 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979411 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979413 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979416 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979425 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979428 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979431 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979433 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979436 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979438 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979441 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.979443 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980330 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980340 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980350 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980355 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980359 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980363 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:10:38.982388 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980367 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980371 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980374 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980377 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980381 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980386 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980389 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980392 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980395 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980398 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980401 2567 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980404 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980407 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980413 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980416 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980419 2567 flags.go:64] FLAG: --config-dir=""
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980422 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980426 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980430 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980439 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980442 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980445 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980448 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980452 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:10:38.982903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980455 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980458 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980461 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980465 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980468 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980471 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980474 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980477 2567 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980480 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980488 2567 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980491 2567 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980494 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980501 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980504 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980508 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980511 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980515 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980518 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980520 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980523 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980526 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980529 2567 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980532 2567 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980535 2567 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980538 2567 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:10:38.983533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980542 2567 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980545 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980548 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980558 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980561 2567 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980564 2567 flags.go:64] FLAG: --help="false"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980567 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-141-192.ec2.internal"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980571 2567 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980574 2567 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980577 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980580 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980583 2567 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980587 2567 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980590 2567 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980593 2567 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980595 2567 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980598 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980602 2567 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980605 2567 flags.go:64] FLAG: --kube-reserved=""
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980609 2567 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980612 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980615 2567 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980618 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980621 2567 flags.go:64] FLAG: --lock-file=""
Apr 16 18:10:38.984148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980624 2567 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980626 2567 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980629 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980634 2567 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980638 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980640 2567 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980643 2567 flags.go:64] FLAG: --logging-format="text"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980646 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980651 2567 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980654 2567 flags.go:64] FLAG: --manifest-url=""
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980657 2567 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980661 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980664 2567 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980669 2567 flags.go:64] FLAG: --max-pods="110"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980672 2567 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980675 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980678 2567 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980681 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980684 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980687 2567 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980690 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980701 2567 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980704 2567 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980707 2567 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 18:10:38.984755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.980710 2567 flags.go:64] FLAG: --pod-cidr=""
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981544 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981552 2567 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981556 2567 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981560 2567 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981564 2567 flags.go:64] FLAG: --port="10250"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981567 2567 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981570 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-026b9533f75e5c04a"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981573 2567 flags.go:64] FLAG: --qos-reserved=""
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981576 2567 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981580 2567 flags.go:64] FLAG: --register-node="true"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981583 2567 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981585 2567 flags.go:64] FLAG: --register-with-taints=""
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981593 2567 flags.go:64] FLAG: --registry-burst="10"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981596 2567 flags.go:64] FLAG: --registry-qps="5"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981599 2567 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981601 2567 flags.go:64] FLAG: --reserved-memory=""
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981605 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981608 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981611 2567 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981614 2567 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981617 2567 flags.go:64] FLAG: --runonce="false"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981620 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981623 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981626 2567 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 18:10:38.985352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981629 2567 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981633 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981636 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981639 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981642 2567 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981645 2567 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981647 2567 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981650 2567 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981653 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981656 2567 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981659 2567 flags.go:64] FLAG: --system-cgroups=""
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981662 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981668 2567 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981671 2567 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981674 2567 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981680 2567 flags.go:64] FLAG: --tls-min-version=""
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981683 2567 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981686 2567 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981689 2567 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981692 2567 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981695 2567 flags.go:64] FLAG: --v="2"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981699 2567 flags.go:64] FLAG: --version="false"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981703 2567 flags.go:64] FLAG: --vmodule=""
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981708 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.981711 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 18:10:38.985960 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981822 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981825 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981828 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981832 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981836 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981838 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981841 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981844 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981847 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981850 2567 feature_gate.go:328] unrecognized feature gate: Example2
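Everything tagged flags.go:64 above is the kubelet echoing back its effective command line, visible here because this kubelet runs with --v="2". To pull just that flag dump out of the journal on a node like this one, something along these lines works (assuming the unit is named kubelet.service, as the "Starting Kubernetes Kubelet..." line suggests; adjust if it differs):

    # Assumption: the kubelet runs as kubelet.service on this node.
    journalctl -u kubelet.service --since "18:10:38" | grep -o 'FLAG: --.*'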
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981852 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981855 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981858 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981860 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981863 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981866 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981869 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981871 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981874 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981877 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:10:38.986626 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981879 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981884 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981887 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981891 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981895 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981897 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981900 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981902 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981905 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981908 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981910 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981913 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981915 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981918 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981920 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981923 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981925 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981933 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981937 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:10:38.987174 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981939 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981942 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981944 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981947 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981949 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981952 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981955 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981957 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981960 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981962 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981965 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981968 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981970 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981973 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981975 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981978 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981981 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981983 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981986 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:10:38.987689 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981988 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981991 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981993 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981996 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.981999 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982001 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982004 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982006 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982009 2567 
feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982011 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982014 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982016 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982025 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982027 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982030 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982033 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982035 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982052 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982056 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982059 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:10:38.988168 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982062 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:10:38.988663 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982064 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:10:38.988663 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982067 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:10:38.988663 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982071 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:10:38.988663 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982074 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:10:38.988663 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982076 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:10:38.988663 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982079 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:10:38.988663 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.982082 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:10:38.988663 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.982087 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true 
StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:10:38.988663 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.988633 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 18:10:38.988663 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.988648 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988694 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988700 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988703 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988706 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988709 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988712 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988714 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988717 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988719 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988722 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988725 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988728 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988730 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988733 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988736 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988740 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988744 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988747 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:10:38.988918 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988750 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988753 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988756 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988759 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988762 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988765 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988767 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988770 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988773 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988775 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988778 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988780 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988783 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988787 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988790 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988793 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988796 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988799 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988801 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988804 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:10:38.989416 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988807 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988810 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988812 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988815 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988817 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988820 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988823 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988825 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988828 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988832 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988836 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988839 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988841 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988844 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988847 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988849 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988852 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988854 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988857 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:10:38.989947 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988860 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988862 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988865 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988867 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988870 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988873 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988875 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988878 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988881 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988883 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988886 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988889 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988891 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988894 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988896 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988899 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988901 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988904 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988907 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988910 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:10:38.990422 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988912 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988915 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988917 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988920 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988923 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988926 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988929 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988931 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.988934 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.988939 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989085 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989091 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989095 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989098 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989101 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989104 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:10:38.990903 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989107 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989110 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989113 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989116 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989125 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989128 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989131 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989134 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989136 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989139 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989141 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989144 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989147 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989149 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989152 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989155 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989158 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989160 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989163 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989165 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:10:38.991320 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989168 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989170 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989173 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989175 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989178 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989180 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989183 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989185 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989188 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989191 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989193 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989196 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989198 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989201 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989203 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989206 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989209 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989220 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989223 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989226 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:10:38.991805 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989228 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989231 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989234 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989236 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989239 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989242 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989244 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989247 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989251 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989254 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989258 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989261 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989263 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989266 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989269 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989272 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989274 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989277 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989280 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:10:38.992373 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989283 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989286 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989289 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989291 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989295 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989297 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989300 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989302 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989305 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989315 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989318 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989328 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989331 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989334 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989336 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989339 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989342 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989344 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989348 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:10:38.992834 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989351 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:10:38.993332 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:38.989354 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:10:38.993332 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.989359 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:10:38.993332 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.990267 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:10:38.993332 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.992673 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:10:38.993805 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.993793 2567 server.go:1019] "Starting client certificate rotation"
Apr 16 18:10:38.993910 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.993892 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:10:38.994544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:38.994533 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:10:39.024964 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.024945 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:10:39.030626 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.027997 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:10:39.050880 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.050859 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:10:39.056784 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.056766 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:10:39.057909 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.057893 2567 log.go:25] "Validated CRI v1 image API"
Apr 16 18:10:39.059471 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.059449 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:10:39.064392 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.064374 2567 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c013f639-1b03-430b-8d59-2dc4f225196f:/dev/nvme0n1p3 ece79215-1fba-4bcc-85ac-08728c4f32eb:/dev/nvme0n1p4]
Apr 16 18:10:39.064442 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.064391 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:10:39.070306 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.070201 2567 manager.go:217] Machine: {Timestamp:2026-04-16 18:10:39.068365217 +0000 UTC m=+0.452028678 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3123661 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ab57ae4f7f54947b5521b49b7f35a SystemUUID:ec2ab57a-e4f7-f549-47b5-521b49b7f35a BootID:eaa44851-4d4f-4aef-9193-7e9d422a6da7 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d6:ea:db:b9:ad Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d6:ea:db:b9:ad Speed:0 Mtu:9001} {Name:ovs-system MacAddress:36:3f:60:4c:f2:59 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:10:39.070306 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.070301 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:10:39.070408 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.070400 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:10:39.072480 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.072458 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:10:39.072612 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.072481 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-192.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:10:39.072658 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.072618 2567 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:10:39.072658 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.072627 2567 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:10:39.072658 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.072639 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:10:39.073737 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.073727 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:10:39.075437 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.075427 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:10:39.075539 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.075530 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:10:39.077499 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.077485 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wqhpw"
Apr 16 18:10:39.079001 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.078991 2567 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:10:39.079051 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.079005 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:10:39.079051 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.079017 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:10:39.079119 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.079065 2567 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:10:39.079119 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.079077 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:10:39.080337 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.080325 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:10:39.080390 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.080343 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:10:39.082184 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.082166 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wqhpw"
Apr 16 18:10:39.084801 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.084782 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:10:39.089639 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.089613 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 18:10:39.091620 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.091602 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 18:10:39.091745 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.091625 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 18:10:39.091745 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.091632 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 18:10:39.091745 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.091638 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 18:10:39.091745 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.091643 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 18:10:39.091745 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.091650 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 18:10:39.091745 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.091655 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 18:10:39.091745 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.091661 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 18:10:39.091745 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.091668 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 18:10:39.091745 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.091674 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 18:10:39.091745 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.091698 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 18:10:39.091745 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.091709 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 18:10:39.093729 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.093715 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 18:10:39.093729 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.093729 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 18:10:39.094400 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.094384 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:10:39.096099 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.096082 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:10:39.097275 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.097263 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 18:10:39.097324 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.097304 2567 server.go:1295] "Started kubelet"
Apr 16 18:10:39.097419 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.097397 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 18:10:39.097481 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.097431 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 18:10:39.097518 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.097504 2567 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 18:10:39.098021 ip-10-0-141-192 systemd[1]: Started Kubernetes Kubelet.
Apr 16 18:10:39.098716 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.098666 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 18:10:39.099335 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.099322 2567 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 18:10:39.099944 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.099916 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-192.ec2.internal" not found
Apr 16 18:10:39.108333 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.108307 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 18:10:39.108842 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.108828 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 18:10:39.109453 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.109434 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 18:10:39.109453 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.109437 2567 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 18:10:39.109453 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.109462 2567 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 18:10:39.109659 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.109583 2567 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 18:10:39.109659 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.109592 2567 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 18:10:39.109659 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.109631 2567 factory.go:55] Registering systemd factory
Apr 16 18:10:39.109659 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.109646 2567 factory.go:223] Registration of the systemd container factory successfully
Apr 16 18:10:39.110006 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.109846 2567 factory.go:153] Registering CRI-O factory
Apr 16 18:10:39.110006 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.110006 2567 factory.go:223] Registration of the crio container factory successfully
Apr 16 18:10:39.110467 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:39.109900 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-192.ec2.internal\" not found"
Apr 16 18:10:39.110547 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:39.109953 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 18:10:39.110547 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.110479 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 18:10:39.110547 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.110500 2567 factory.go:103] Registering Raw factory
Apr 16 18:10:39.110547 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.110519 2567 manager.go:1196] Started watching for new ooms in manager
Apr 16 18:10:39.110801 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.110775 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:10:39.110996 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.110980 2567 manager.go:319] Starting recovery of all containers
Apr 16 18:10:39.113257 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:39.113112 2567 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-192.ec2.internal\" not found" node="ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.113515 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.113498 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-192.ec2.internal" not found
Apr 16 18:10:39.121259 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.121239 2567 manager.go:324] Recovery completed
Apr 16 18:10:39.123677 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:39.123652 2567 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 16 18:10:39.126506 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.126495 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:10:39.128355 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.128341 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-192.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:10:39.128410 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.128368 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-192.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:10:39.128410 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.128377 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-192.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:10:39.128787 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.128775 2567 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 18:10:39.128838 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.128787 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 18:10:39.128838 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.128803 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:10:39.132890 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.132877 2567 policy_none.go:49] "None policy: Start"
Apr 16 18:10:39.132890 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.132893 2567 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 18:10:39.132971 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.132903 2567 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 18:10:39.170537 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.170521 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-192.ec2.internal" not found
Apr 16 18:10:39.189678 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.177597 2567 manager.go:341] "Starting Device Plugin manager"
Apr 16 18:10:39.189678 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:39.177631 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 18:10:39.189678 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.177640 2567 server.go:85] "Starting device plugin registration server"
Apr 16 18:10:39.189678 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.177835 2567 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 18:10:39.189678 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.177844 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 18:10:39.189678 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.177962 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 18:10:39.189678 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.178087 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 18:10:39.189678 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.178097 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 18:10:39.189678 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:39.178501 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 18:10:39.189678 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:39.178545 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-192.ec2.internal\" not found"
Apr 16 18:10:39.235359 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.235289 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 18:10:39.236539 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.236521 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 18:10:39.236651 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.236549 2567 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 18:10:39.236651 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.236568 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 18:10:39.236651 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.236575 2567 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 18:10:39.236651 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:39.236616 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 18:10:39.238345 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.238329 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:10:39.278284 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.278249 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:10:39.279135 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.279120 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-192.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:10:39.279220 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.279150 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-192.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:10:39.279220 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.279160 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-192.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:10:39.279220 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.279182 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.286585 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.286572 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.337091 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.337033 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-192.ec2.internal"]
Apr 16 18:10:39.340317 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.340296 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.340445 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.340428 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.360583 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.360562 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.364912 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.364896 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.373420 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.373404 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:10:39.378529 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.378514 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:10:39.410664 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.410636 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ddd6b3915ea610317bcd527288295ec4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal\" (UID: \"ddd6b3915ea610317bcd527288295ec4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.410799 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.410670 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddd6b3915ea610317bcd527288295ec4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal\" (UID: \"ddd6b3915ea610317bcd527288295ec4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.410799 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.410706 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3b61c38e01c9c0b2139dda9e09bff1f5-config\") pod \"kube-apiserver-proxy-ip-10-0-141-192.ec2.internal\" (UID: \"3b61c38e01c9c0b2139dda9e09bff1f5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.510950 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.510864 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3b61c38e01c9c0b2139dda9e09bff1f5-config\") pod \"kube-apiserver-proxy-ip-10-0-141-192.ec2.internal\" (UID: \"3b61c38e01c9c0b2139dda9e09bff1f5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.510950 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.510897 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ddd6b3915ea610317bcd527288295ec4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal\" (UID: \"ddd6b3915ea610317bcd527288295ec4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.510950 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.510919 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddd6b3915ea610317bcd527288295ec4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal\" (UID: \"ddd6b3915ea610317bcd527288295ec4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.511176 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.510952 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddd6b3915ea610317bcd527288295ec4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal\" (UID: \"ddd6b3915ea610317bcd527288295ec4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.511176 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.510952 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3b61c38e01c9c0b2139dda9e09bff1f5-config\") pod \"kube-apiserver-proxy-ip-10-0-141-192.ec2.internal\" (UID: \"3b61c38e01c9c0b2139dda9e09bff1f5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.511176 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.510957 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ddd6b3915ea610317bcd527288295ec4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal\" (UID: \"ddd6b3915ea610317bcd527288295ec4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.678191 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.678152 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.681887 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.681868 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal"
Apr 16 18:10:39.993095 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.992989 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:10:39.993794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.993147 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:10:39.993794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.993168 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:10:39.993794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:39.993147 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:10:40.079651 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.079621 2567 apiserver.go:52] "Watching apiserver"
Apr 16 18:10:40.084685 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.084643 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:05:39 +0000 UTC" deadline="2027-10-26 05:35:18.449563675 +0000 UTC"
Apr 16 18:10:40.084685 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.084683 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13379h24m38.364884295s"
Apr 16 18:10:40.089798 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.089779 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:10:40.090791 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.090767 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-l8nt5","openshift-ovn-kubernetes/ovnkube-node-fk24r","kube-system/konnectivity-agent-bdf4x","kube-system/kube-apiserver-proxy-ip-10-0-141-192.ec2.internal","openshift-cluster-node-tuning-operator/tuned-2p2bw","openshift-image-registry/node-ca-5zgqq","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal","openshift-multus/multus-nth7f","openshift-multus/network-metrics-daemon-8w622","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7","openshift-dns/node-resolver-9pxxv","openshift-multus/multus-additional-cni-plugins-mq5wk","openshift-network-diagnostics/network-check-target-9tn4q"]
Apr 16 18:10:40.095017 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.094998 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-l8nt5"
Apr 16 18:10:40.096493 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.096463 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.096729 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.096681 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bdf4x"
Apr 16 18:10:40.097464 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.097439 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:10:40.097570 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.097466 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7rjld\""
Apr 16 18:10:40.097839 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.097826 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:10:40.097895 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.097881 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:10:40.098995 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.098655 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.100920 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.100751 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:10:40.100920 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.100780 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:10:40.100920 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.100890 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:10:40.100920 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.100898 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:10:40.101166 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.100903 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:10:40.101166 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.101005 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:10:40.101302 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.101264 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:10:40.101428 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.101371 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:10:40.101496 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.101455 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5zgqq"
Apr 16 18:10:40.101496 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.101463 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rdpj2\""
Apr 16 18:10:40.102253 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.102196 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gtcbv\""
Apr 16 18:10:40.102361 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.102284 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:10:40.102361 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.102285 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:10:40.102361 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.102326 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gkmbz\""
Apr 16 18:10:40.103796 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.103776 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rppnt\""
Apr 16 18:10:40.104080 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.104035 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 18:10:40.104171 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.104156 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.104235 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.104178 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:10:40.104384 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.104158 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 18:10:40.106415 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.106397 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:10:40.106533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.106515 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-x7chk\""
Apr 16 18:10:40.106626 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.106611 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:10:40.106732 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.106671 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622"
Apr 16 18:10:40.106732 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.106685 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:10:40.106837 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.106739 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:10:40.106837 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:40.106741 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e"
Apr 16 18:10:40.108178 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.108159 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7"
Apr 16 18:10:40.108382 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.108368 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:10:40.109590 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.109575 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9pxxv"
Apr 16 18:10:40.111183 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.111164 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mq5wk"
Apr 16 18:10:40.111621 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.111603 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 18:10:40.111621 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.111617 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:10:40.111745 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.111733 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-slxgm\""
Apr 16 18:10:40.111805 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.111794 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:10:40.112367 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.112354 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2phrw\""
Apr 16 18:10:40.112575 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.112561 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q"
Apr 16 18:10:40.112620 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:40.112606 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828"
Apr 16 18:10:40.112701 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.112686 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:10:40.112751 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.112691 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:10:40.114095 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114076 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 18:10:40.114198 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114130 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:10:40.114198 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114170 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-h459l\""
Apr 16 18:10:40.114407 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114391 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-run-netns\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.114472 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114415 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6twzf\" (UniqueName: \"kubernetes.io/projected/b488ff1d-ff75-45f2-8473-6f48445e1b55-kube-api-access-6twzf\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.114472 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114432 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-sysctl-conf\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.114472 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114459 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-system-cni-dir\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.114570 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114481 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-var-lib-cni-multus\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.114570 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114497 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0189e11b-ff99-45bd-a5b2-5f0873332309-system-cni-dir\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk"
Apr 16 18:10:40.114570 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114513 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0189e11b-ff99-45bd-a5b2-5f0873332309-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk"
Apr 16 18:10:40.114570 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114528 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-slash\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.114570 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114548 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-node-log\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.114570 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114562 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b488ff1d-ff75-45f2-8473-6f48445e1b55-env-overrides\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.114856 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114600 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-etc-openvswitch\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.114856 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114626 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-multus-conf-dir\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.114856 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114642 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-cni-netd\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.114856 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114671 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjk7z\" (UniqueName: \"kubernetes.io/projected/185c47d9-cd61-422a-b3f0-0a6dd4148756-kube-api-access-wjk7z\") pod \"iptables-alerter-l8nt5\" (UID: \"185c47d9-cd61-422a-b3f0-0a6dd4148756\") " pod="openshift-network-operator/iptables-alerter-l8nt5"
Apr 16 18:10:40.114856 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114701 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-kubernetes\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.114856 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114725 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-cnibin\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.114856 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114741 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-hostroot\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.114856 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114754 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-run-multus-certs\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.114856 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114772 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdxnh\" (UniqueName: \"kubernetes.io/projected/059c23ce-c3ea-4b83-a8a4-3c537435306e-kube-api-access-qdxnh\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622"
Apr 16 18:10:40.114856 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114832 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-multus-cni-dir\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114864 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-var-lib-cni-bin\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114892 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-run-openvswitch\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114917 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-systemd\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114942 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-var-lib-kubelet\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114966 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-run-k8s-cni-cncf-io\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.114993 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vsn7\" (UniqueName: \"kubernetes.io/projected/a8a6a8c0-5642-4be5-90d3-2827312267c3-kube-api-access-7vsn7\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115016 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gj4n\" (UniqueName: \"kubernetes.io/projected/9a432113-3c33-4a2e-970d-baa3beba7cc7-kube-api-access-7gj4n\") pod \"node-resolver-9pxxv\" (UID: \"9a432113-3c33-4a2e-970d-baa3beba7cc7\") " pod="openshift-dns/node-resolver-9pxxv"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115055 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tklw\" (UniqueName: \"kubernetes.io/projected/0189e11b-ff99-45bd-a5b2-5f0873332309-kube-api-access-5tklw\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115078 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-run\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115102 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-lib-modules\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115124 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-tuned\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115147 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9a432113-3c33-4a2e-970d-baa3beba7cc7-hosts-file\") pod \"node-resolver-9pxxv\" (UID: \"9a432113-3c33-4a2e-970d-baa3beba7cc7\") " pod="openshift-dns/node-resolver-9pxxv"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115169 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-kubelet\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115191 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-run-ovn\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115223 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a423aaa2-c387-4ccb-a9ca-a627b634154d-konnectivity-ca\") pod \"konnectivity-agent-bdf4x\" (UID: \"a423aaa2-c387-4ccb-a9ca-a627b634154d\") " pod="kube-system/konnectivity-agent-bdf4x"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115248 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/797c1b6c-d66b-4ed4-9555-07a34d9d2f2a-host\") pod \"node-ca-5zgqq\" (UID: \"797c1b6c-d66b-4ed4-9555-07a34d9d2f2a\") " pod="openshift-image-registry/node-ca-5zgqq"
Apr 16 18:10:40.115307 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115276 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-sys\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115314 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-sysconfig\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115343 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-multus-socket-dir-parent\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115365 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n9sp\" (UniqueName: \"kubernetes.io/projected/797c1b6c-d66b-4ed4-9555-07a34d9d2f2a-kube-api-access-7n9sp\") pod \"node-ca-5zgqq\" (UID: \"797c1b6c-d66b-4ed4-9555-07a34d9d2f2a\") " pod="openshift-image-registry/node-ca-5zgqq"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115380 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhdmw\" (UniqueName: \"kubernetes.io/projected/7da2a734-1f23-41a0-a455-b5e1b7871e27-kube-api-access-dhdmw\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115420 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-var-lib-openvswitch\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115463 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b488ff1d-ff75-45f2-8473-6f48445e1b55-ovn-node-metrics-cert\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115491 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/797c1b6c-d66b-4ed4-9555-07a34d9d2f2a-serviceca\") pod \"node-ca-5zgqq\" (UID: \"797c1b6c-d66b-4ed4-9555-07a34d9d2f2a\") " pod="openshift-image-registry/node-ca-5zgqq"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115516 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/185c47d9-cd61-422a-b3f0-0a6dd4148756-host-slash\") pod \"iptables-alerter-l8nt5\" (UID: \"185c47d9-cd61-422a-b3f0-0a6dd4148756\") " pod="openshift-network-operator/iptables-alerter-l8nt5"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115544 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-modprobe-d\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115573 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-var-lib-kubelet\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115595 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a8a6a8c0-5642-4be5-90d3-2827312267c3-multus-daemon-config\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115620 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p67vr\" (UniqueName: \"kubernetes.io/projected/b77ab947-a0da-49c9-9e9a-c4d71f2de312-kube-api-access-p67vr\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115650 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115670 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-host\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115698 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-os-release\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.115916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115732 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0189e11b-ff99-45bd-a5b2-5f0873332309-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115752 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/185c47d9-cd61-422a-b3f0-0a6dd4148756-iptables-alerter-script\") pod \"iptables-alerter-l8nt5\" (UID: \"185c47d9-cd61-422a-b3f0-0a6dd4148756\") " pod="openshift-network-operator/iptables-alerter-l8nt5"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115768 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-sysctl-d\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115782 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a8a6a8c0-5642-4be5-90d3-2827312267c3-cni-binary-copy\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115805 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-systemd-units\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115838 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115855 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7da2a734-1f23-41a0-a455-b5e1b7871e27-tmp\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115879 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-etc-kubernetes\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115895 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9a432113-3c33-4a2e-970d-baa3beba7cc7-tmp-dir\") pod \"node-resolver-9pxxv\" (UID: \"9a432113-3c33-4a2e-970d-baa3beba7cc7\") " pod="openshift-dns/node-resolver-9pxxv"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115911 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0189e11b-ff99-45bd-a5b2-5f0873332309-os-release\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115926 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-run-systemd\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115948 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-cni-bin\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115968 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b488ff1d-ff75-45f2-8473-6f48445e1b55-ovnkube-config\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115982 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b488ff1d-ff75-45f2-8473-6f48445e1b55-ovnkube-script-lib\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.115997 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0189e11b-ff99-45bd-a5b2-5f0873332309-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.116013 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-log-socket\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.116469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.116037 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-run-ovn-kubernetes\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.116898 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.116074 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a423aaa2-c387-4ccb-a9ca-a627b634154d-agent-certs\") pod \"konnectivity-agent-bdf4x\" (UID: \"a423aaa2-c387-4ccb-a9ca-a627b634154d\") " pod="kube-system/konnectivity-agent-bdf4x"
Apr 16 18:10:40.116898 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.116097 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7"
Apr 16 18:10:40.116898 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.116135 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-device-dir\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7"
Apr 16 18:10:40.116898 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.116157 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-etc-selinux\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7"
Apr 16 18:10:40.116898 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.116175 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0189e11b-ff99-45bd-a5b2-5f0873332309-cnibin\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk"
Apr 16 18:10:40.116898 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.116190 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-run-netns\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.116898 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.116204 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-socket-dir\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7"
Apr 16 18:10:40.116898 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.116218 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-registration-dir\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7"
Apr 16 18:10:40.116898 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.116235 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-sys-fs\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7"
Apr 16 18:10:40.116898 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.116249 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0189e11b-ff99-45bd-a5b2-5f0873332309-cni-binary-copy\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk"
Apr 16 18:10:40.119340 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.119322 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:10:40.175208 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:40.175162 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddd6b3915ea610317bcd527288295ec4.slice/crio-8a27e5b92bcd61811dfcd0c491d318a9731690ac1133d31e8fedb7fbdf193327 WatchSource:0}: Error finding container 8a27e5b92bcd61811dfcd0c491d318a9731690ac1133d31e8fedb7fbdf193327: Status 404 returned error can't find the container with id 8a27e5b92bcd61811dfcd0c491d318a9731690ac1133d31e8fedb7fbdf193327
Apr 16 18:10:40.175430 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:40.175413 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b61c38e01c9c0b2139dda9e09bff1f5.slice/crio-f72487960d70973934d8f7a75f8fe74157b432943b8a399bd27788332f80db65 WatchSource:0}: Error finding container f72487960d70973934d8f7a75f8fe74157b432943b8a399bd27788332f80db65: Status 404 returned error can't find the container with id f72487960d70973934d8f7a75f8fe74157b432943b8a399bd27788332f80db65
Apr 16 18:10:40.180199 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.180184 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:10:40.188967 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.188949 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-78lq7"
Apr 16 18:10:40.199185 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.199164 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-78lq7"
Apr 16 18:10:40.210913 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.210893 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 18:10:40.216405 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216388 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gj4n\" (UniqueName: \"kubernetes.io/projected/9a432113-3c33-4a2e-970d-baa3beba7cc7-kube-api-access-7gj4n\") pod \"node-resolver-9pxxv\" (UID: \"9a432113-3c33-4a2e-970d-baa3beba7cc7\") " pod="openshift-dns/node-resolver-9pxxv"
Apr 16 18:10:40.216477 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216415 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tklw\" (UniqueName: \"kubernetes.io/projected/0189e11b-ff99-45bd-a5b2-5f0873332309-kube-api-access-5tklw\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk"
Apr 16 18:10:40.216477 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216438 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-run\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.216477 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216460 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-lib-modules\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.216616 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216482 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-tuned\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.216616 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216503 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9a432113-3c33-4a2e-970d-baa3beba7cc7-hosts-file\") pod \"node-resolver-9pxxv\" (UID: \"9a432113-3c33-4a2e-970d-baa3beba7cc7\") " pod="openshift-dns/node-resolver-9pxxv"
Apr 16 18:10:40.216616 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216523 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-kubelet\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.216616 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216544 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-run-ovn\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.216616 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216570 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a423aaa2-c387-4ccb-a9ca-a627b634154d-konnectivity-ca\") pod \"konnectivity-agent-bdf4x\" (UID: \"a423aaa2-c387-4ccb-a9ca-a627b634154d\") " pod="kube-system/konnectivity-agent-bdf4x"
Apr 16 18:10:40.216616 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216580 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-lib-modules\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.216616 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216593 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/797c1b6c-d66b-4ed4-9555-07a34d9d2f2a-host\") pod \"node-ca-5zgqq\" (UID: \"797c1b6c-d66b-4ed4-9555-07a34d9d2f2a\") " pod="openshift-image-registry/node-ca-5zgqq"
Apr 16 18:10:40.216616 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216614 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-sys\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216639 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-sysconfig\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216635 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-run-ovn\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216664 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-multus-socket-dir-parent\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216689 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7n9sp\" (UniqueName: \"kubernetes.io/projected/797c1b6c-d66b-4ed4-9555-07a34d9d2f2a-kube-api-access-7n9sp\") pod \"node-ca-5zgqq\" (UID: \"797c1b6c-d66b-4ed4-9555-07a34d9d2f2a\") " pod="openshift-image-registry/node-ca-5zgqq"
Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216700 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-sys\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216707 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-sysconfig\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216719 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8xjp\" (UniqueName: \"kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp\") pod \"network-check-target-9tn4q\" (UID: \"30b4fba5-7d19-4433-b373-76fe14544828\") " pod="openshift-network-diagnostics/network-check-target-9tn4q"
Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216745 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhdmw\" (UniqueName: \"kubernetes.io/projected/7da2a734-1f23-41a0-a455-b5e1b7871e27-kube-api-access-dhdmw\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw"
Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216765 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/797c1b6c-d66b-4ed4-9555-07a34d9d2f2a-host\") pod \"node-ca-5zgqq\" (UID: \"797c1b6c-d66b-4ed4-9555-07a34d9d2f2a\") " pod="openshift-image-registry/node-ca-5zgqq"
Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216767 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-multus-socket-dir-parent\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f"
Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216770 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-var-lib-openvswitch\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r"
Apr 16 18:10:40.216975 ip-10-0-141-192
kubenswrapper[2567]: I0416 18:10:40.216634 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9a432113-3c33-4a2e-970d-baa3beba7cc7-hosts-file\") pod \"node-resolver-9pxxv\" (UID: \"9a432113-3c33-4a2e-970d-baa3beba7cc7\") " pod="openshift-dns/node-resolver-9pxxv" Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216809 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-var-lib-openvswitch\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216817 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b488ff1d-ff75-45f2-8473-6f48445e1b55-ovn-node-metrics-cert\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216811 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-run\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216846 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/797c1b6c-d66b-4ed4-9555-07a34d9d2f2a-serviceca\") pod \"node-ca-5zgqq\" (UID: \"797c1b6c-d66b-4ed4-9555-07a34d9d2f2a\") " pod="openshift-image-registry/node-ca-5zgqq" Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216830 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:10:40.216975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216883 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/185c47d9-cd61-422a-b3f0-0a6dd4148756-host-slash\") pod \"iptables-alerter-l8nt5\" (UID: \"185c47d9-cd61-422a-b3f0-0a6dd4148756\") " pod="openshift-network-operator/iptables-alerter-l8nt5" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216933 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-modprobe-d\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216960 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-var-lib-kubelet\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216976 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/185c47d9-cd61-422a-b3f0-0a6dd4148756-host-slash\") pod \"iptables-alerter-l8nt5\" (UID: \"185c47d9-cd61-422a-b3f0-0a6dd4148756\") " pod="openshift-network-operator/iptables-alerter-l8nt5" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.216984 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a8a6a8c0-5642-4be5-90d3-2827312267c3-multus-daemon-config\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217025 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p67vr\" (UniqueName: \"kubernetes.io/projected/b77ab947-a0da-49c9-9e9a-c4d71f2de312-kube-api-access-p67vr\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217083 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-modprobe-d\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217129 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217082 2567 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217167 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-host\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217200 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a423aaa2-c387-4ccb-a9ca-a627b634154d-konnectivity-ca\") pod \"konnectivity-agent-bdf4x\" (UID: \"a423aaa2-c387-4ccb-a9ca-a627b634154d\") " pod="kube-system/konnectivity-agent-bdf4x" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217191 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-os-release\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217196 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-var-lib-kubelet\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217242 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0189e11b-ff99-45bd-a5b2-5f0873332309-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217257 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/797c1b6c-d66b-4ed4-9555-07a34d9d2f2a-serviceca\") pod \"node-ca-5zgqq\" (UID: \"797c1b6c-d66b-4ed4-9555-07a34d9d2f2a\") " pod="openshift-image-registry/node-ca-5zgqq" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217282 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/185c47d9-cd61-422a-b3f0-0a6dd4148756-iptables-alerter-script\") pod \"iptables-alerter-l8nt5\" (UID: \"185c47d9-cd61-422a-b3f0-0a6dd4148756\") " pod="openshift-network-operator/iptables-alerter-l8nt5" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217299 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-host\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.217794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217309 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-sysctl-d\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217320 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-os-release\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217334 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a8a6a8c0-5642-4be5-90d3-2827312267c3-cni-binary-copy\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217328 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-kubelet\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217382 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-systemd-units\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217415 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217427 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-sysctl-d\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217442 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7da2a734-1f23-41a0-a455-b5e1b7871e27-tmp\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217466 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-etc-kubernetes\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217467 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-systemd-units\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217484 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9a432113-3c33-4a2e-970d-baa3beba7cc7-tmp-dir\") pod \"node-resolver-9pxxv\" (UID: \"9a432113-3c33-4a2e-970d-baa3beba7cc7\") " pod="openshift-dns/node-resolver-9pxxv" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217507 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0189e11b-ff99-45bd-a5b2-5f0873332309-os-release\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217550 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-run-systemd\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217576 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-cni-bin\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217595 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a8a6a8c0-5642-4be5-90d3-2827312267c3-multus-daemon-config\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217600 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b488ff1d-ff75-45f2-8473-6f48445e1b55-ovnkube-config\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217623 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b488ff1d-ff75-45f2-8473-6f48445e1b55-ovnkube-script-lib\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.218544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217649 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0189e11b-ff99-45bd-a5b2-5f0873332309-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.218544 ip-10-0-141-192 
kubenswrapper[2567]: I0416 18:10:40.217666 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0189e11b-ff99-45bd-a5b2-5f0873332309-os-release\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217676 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-log-socket\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217700 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-run-ovn-kubernetes\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217707 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-etc-kubernetes\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217724 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a423aaa2-c387-4ccb-a9ca-a627b634154d-agent-certs\") pod \"konnectivity-agent-bdf4x\" (UID: \"a423aaa2-c387-4ccb-a9ca-a627b634154d\") " pod="kube-system/konnectivity-agent-bdf4x" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217778 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217804 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-device-dir\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217827 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-etc-selinux\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217851 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0189e11b-ff99-45bd-a5b2-5f0873332309-cnibin\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: 
\"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217874 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-run-netns\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217894 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a8a6a8c0-5642-4be5-90d3-2827312267c3-cni-binary-copy\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217903 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-socket-dir\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.217958 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:40.217989 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218003 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9a432113-3c33-4a2e-970d-baa3beba7cc7-tmp-dir\") pod \"node-resolver-9pxxv\" (UID: \"9a432113-3c33-4a2e-970d-baa3beba7cc7\") " pod="openshift-dns/node-resolver-9pxxv" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218005 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-device-dir\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218059 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-etc-selinux\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.219376 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:40.218097 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs podName:059c23ce-c3ea-4b83-a8a4-3c537435306e nodeName:}" failed. No retries permitted until 2026-04-16 18:10:40.718030224 +0000 UTC m=+2.101693676 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs") pod "network-metrics-daemon-8w622" (UID: "059c23ce-c3ea-4b83-a8a4-3c537435306e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218114 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0189e11b-ff99-45bd-a5b2-5f0873332309-cnibin\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218151 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-run-netns\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218162 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-socket-dir\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218189 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-log-socket\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218204 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-run-ovn-kubernetes\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218247 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-run-systemd\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218298 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b488ff1d-ff75-45f2-8473-6f48445e1b55-ovnkube-script-lib\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218352 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-cni-bin\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218390 
2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0189e11b-ff99-45bd-a5b2-5f0873332309-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218391 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/185c47d9-cd61-422a-b3f0-0a6dd4148756-iptables-alerter-script\") pod \"iptables-alerter-l8nt5\" (UID: \"185c47d9-cd61-422a-b3f0-0a6dd4148756\") " pod="openshift-network-operator/iptables-alerter-l8nt5" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218450 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-registration-dir\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218478 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-sys-fs\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218520 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0189e11b-ff99-45bd-a5b2-5f0873332309-cni-binary-copy\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218520 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-registration-dir\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218583 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b77ab947-a0da-49c9-9e9a-c4d71f2de312-sys-fs\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218621 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-run-netns\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.220230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218684 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b488ff1d-ff75-45f2-8473-6f48445e1b55-ovnkube-config\") pod 
\"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218674 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-run-netns\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218710 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6twzf\" (UniqueName: \"kubernetes.io/projected/b488ff1d-ff75-45f2-8473-6f48445e1b55-kube-api-access-6twzf\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218723 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0189e11b-ff99-45bd-a5b2-5f0873332309-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218738 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-sysctl-conf\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218844 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-sysctl-conf\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218886 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-system-cni-dir\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218918 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-var-lib-cni-multus\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218962 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-system-cni-dir\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218993 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-var-lib-cni-multus\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.218963 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0189e11b-ff99-45bd-a5b2-5f0873332309-system-cni-dir\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219054 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0189e11b-ff99-45bd-a5b2-5f0873332309-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219002 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0189e11b-ff99-45bd-a5b2-5f0873332309-system-cni-dir\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219071 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-slash\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219119 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-slash\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219139 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-node-log\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219167 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b488ff1d-ff75-45f2-8473-6f48445e1b55-env-overrides\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.220968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219205 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-etc-openvswitch\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 
18:10:40.219231 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-node-log\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219249 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-multus-conf-dir\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219276 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-cni-netd\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219301 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjk7z\" (UniqueName: \"kubernetes.io/projected/185c47d9-cd61-422a-b3f0-0a6dd4148756-kube-api-access-wjk7z\") pod \"iptables-alerter-l8nt5\" (UID: \"185c47d9-cd61-422a-b3f0-0a6dd4148756\") " pod="openshift-network-operator/iptables-alerter-l8nt5" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219326 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-kubernetes\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219351 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-cnibin\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219376 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-hostroot\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219402 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-run-multus-certs\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219429 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdxnh\" (UniqueName: \"kubernetes.io/projected/059c23ce-c3ea-4b83-a8a4-3c537435306e-kube-api-access-qdxnh\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:40.221450 ip-10-0-141-192 
kubenswrapper[2567]: I0416 18:10:40.219455 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-multus-cni-dir\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219465 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0189e11b-ff99-45bd-a5b2-5f0873332309-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219479 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-var-lib-cni-bin\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219504 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-run-openvswitch\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219526 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-etc-openvswitch\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219530 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-systemd\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219564 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-var-lib-kubelet\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219574 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-systemd\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.221450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219591 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-run-k8s-cni-cncf-io\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " 
pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219598 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b488ff1d-ff75-45f2-8473-6f48445e1b55-env-overrides\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219618 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vsn7\" (UniqueName: \"kubernetes.io/projected/a8a6a8c0-5642-4be5-90d3-2827312267c3-kube-api-access-7vsn7\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219623 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-cnibin\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219714 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-host-cni-netd\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219757 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-kubernetes\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219806 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-multus-cni-dir\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219823 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-multus-conf-dir\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219840 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-var-lib-cni-bin\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219870 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-hostroot\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 
18:10:40.219872 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-run-multus-certs\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219900 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7da2a734-1f23-41a0-a455-b5e1b7871e27-var-lib-kubelet\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219916 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a8a6a8c0-5642-4be5-90d3-2827312267c3-host-run-k8s-cni-cncf-io\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.219963 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0189e11b-ff99-45bd-a5b2-5f0873332309-cni-binary-copy\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.220012 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b488ff1d-ff75-45f2-8473-6f48445e1b55-run-openvswitch\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.220253 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7da2a734-1f23-41a0-a455-b5e1b7871e27-etc-tuned\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.220337 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b488ff1d-ff75-45f2-8473-6f48445e1b55-ovn-node-metrics-cert\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.220615 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a423aaa2-c387-4ccb-a9ca-a627b634154d-agent-certs\") pod \"konnectivity-agent-bdf4x\" (UID: \"a423aaa2-c387-4ccb-a9ca-a627b634154d\") " pod="kube-system/konnectivity-agent-bdf4x" Apr 16 18:10:40.221905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.220962 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7da2a734-1f23-41a0-a455-b5e1b7871e27-tmp\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.224201 ip-10-0-141-192 kubenswrapper[2567]: I0416 
18:10:40.224174 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gj4n\" (UniqueName: \"kubernetes.io/projected/9a432113-3c33-4a2e-970d-baa3beba7cc7-kube-api-access-7gj4n\") pod \"node-resolver-9pxxv\" (UID: \"9a432113-3c33-4a2e-970d-baa3beba7cc7\") " pod="openshift-dns/node-resolver-9pxxv" Apr 16 18:10:40.224352 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.224338 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhdmw\" (UniqueName: \"kubernetes.io/projected/7da2a734-1f23-41a0-a455-b5e1b7871e27-kube-api-access-dhdmw\") pod \"tuned-2p2bw\" (UID: \"7da2a734-1f23-41a0-a455-b5e1b7871e27\") " pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.224549 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.224532 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n9sp\" (UniqueName: \"kubernetes.io/projected/797c1b6c-d66b-4ed4-9555-07a34d9d2f2a-kube-api-access-7n9sp\") pod \"node-ca-5zgqq\" (UID: \"797c1b6c-d66b-4ed4-9555-07a34d9d2f2a\") " pod="openshift-image-registry/node-ca-5zgqq" Apr 16 18:10:40.224776 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.224760 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tklw\" (UniqueName: \"kubernetes.io/projected/0189e11b-ff99-45bd-a5b2-5f0873332309-kube-api-access-5tklw\") pod \"multus-additional-cni-plugins-mq5wk\" (UID: \"0189e11b-ff99-45bd-a5b2-5f0873332309\") " pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.224889 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.224874 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p67vr\" (UniqueName: \"kubernetes.io/projected/b77ab947-a0da-49c9-9e9a-c4d71f2de312-kube-api-access-p67vr\") pod \"aws-ebs-csi-driver-node-6d2m7\" (UID: \"b77ab947-a0da-49c9-9e9a-c4d71f2de312\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.226663 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.226640 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdxnh\" (UniqueName: \"kubernetes.io/projected/059c23ce-c3ea-4b83-a8a4-3c537435306e-kube-api-access-qdxnh\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:40.227189 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.227171 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6twzf\" (UniqueName: \"kubernetes.io/projected/b488ff1d-ff75-45f2-8473-6f48445e1b55-kube-api-access-6twzf\") pod \"ovnkube-node-fk24r\" (UID: \"b488ff1d-ff75-45f2-8473-6f48445e1b55\") " pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.227284 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.227270 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vsn7\" (UniqueName: \"kubernetes.io/projected/a8a6a8c0-5642-4be5-90d3-2827312267c3-kube-api-access-7vsn7\") pod \"multus-nth7f\" (UID: \"a8a6a8c0-5642-4be5-90d3-2827312267c3\") " pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.227558 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.227543 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjk7z\" (UniqueName: \"kubernetes.io/projected/185c47d9-cd61-422a-b3f0-0a6dd4148756-kube-api-access-wjk7z\") pod \"iptables-alerter-l8nt5\" (UID: 
\"185c47d9-cd61-422a-b3f0-0a6dd4148756\") " pod="openshift-network-operator/iptables-alerter-l8nt5" Apr 16 18:10:40.239353 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.239320 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal" event={"ID":"ddd6b3915ea610317bcd527288295ec4","Type":"ContainerStarted","Data":"8a27e5b92bcd61811dfcd0c491d318a9731690ac1133d31e8fedb7fbdf193327"} Apr 16 18:10:40.240291 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.240272 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-192.ec2.internal" event={"ID":"3b61c38e01c9c0b2139dda9e09bff1f5","Type":"ContainerStarted","Data":"f72487960d70973934d8f7a75f8fe74157b432943b8a399bd27788332f80db65"} Apr 16 18:10:40.320645 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.320571 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xjp\" (UniqueName: \"kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp\") pod \"network-check-target-9tn4q\" (UID: \"30b4fba5-7d19-4433-b373-76fe14544828\") " pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:10:40.325832 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:40.325816 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:40.325873 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:40.325836 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:40.325873 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:40.325845 2567 projected.go:194] Error preparing data for projected volume kube-api-access-v8xjp for pod openshift-network-diagnostics/network-check-target-9tn4q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:40.325934 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:40.325892 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp podName:30b4fba5-7d19-4433-b373-76fe14544828 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:40.82587868 +0000 UTC m=+2.209542129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-v8xjp" (UniqueName: "kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp") pod "network-check-target-9tn4q" (UID: "30b4fba5-7d19-4433-b373-76fe14544828") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:40.421633 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.421605 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-l8nt5" Apr 16 18:10:40.427798 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:40.427769 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod185c47d9_cd61_422a_b3f0_0a6dd4148756.slice/crio-a4c9426f7e1970f75d3bfbe3d545d21fdd067f78955d629cd291bae3818abe42 WatchSource:0}: Error finding container a4c9426f7e1970f75d3bfbe3d545d21fdd067f78955d629cd291bae3818abe42: Status 404 returned error can't find the container with id a4c9426f7e1970f75d3bfbe3d545d21fdd067f78955d629cd291bae3818abe42 Apr 16 18:10:40.439630 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.439607 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:10:40.446573 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:40.446551 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb488ff1d_ff75_45f2_8473_6f48445e1b55.slice/crio-2c85601b8e0d80800ce9dc15ef686b00fa11d48effde905be6d5f6ac94317299 WatchSource:0}: Error finding container 2c85601b8e0d80800ce9dc15ef686b00fa11d48effde905be6d5f6ac94317299: Status 404 returned error can't find the container with id 2c85601b8e0d80800ce9dc15ef686b00fa11d48effde905be6d5f6ac94317299 Apr 16 18:10:40.455901 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.455868 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bdf4x" Apr 16 18:10:40.462066 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:40.462028 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda423aaa2_c387_4ccb_a9ca_a627b634154d.slice/crio-dfb8c7f10b539ac0d6d5986bbfee75ba90952ee186c6b2c88c0ab1dc238899e5 WatchSource:0}: Error finding container dfb8c7f10b539ac0d6d5986bbfee75ba90952ee186c6b2c88c0ab1dc238899e5: Status 404 returned error can't find the container with id dfb8c7f10b539ac0d6d5986bbfee75ba90952ee186c6b2c88c0ab1dc238899e5 Apr 16 18:10:40.476202 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.476183 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" Apr 16 18:10:40.481559 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:40.481536 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7da2a734_1f23_41a0_a455_b5e1b7871e27.slice/crio-ca7d02cbd8f2f3c0ee7633b8d3a07a2958330b1552b0345d76a5ba883c8ed4fc WatchSource:0}: Error finding container ca7d02cbd8f2f3c0ee7633b8d3a07a2958330b1552b0345d76a5ba883c8ed4fc: Status 404 returned error can't find the container with id ca7d02cbd8f2f3c0ee7633b8d3a07a2958330b1552b0345d76a5ba883c8ed4fc Apr 16 18:10:40.482034 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.482016 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5zgqq" Apr 16 18:10:40.487472 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:40.487453 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod797c1b6c_d66b_4ed4_9555_07a34d9d2f2a.slice/crio-7a41b895eb27d5da995e371c1f491475840bc5c6f965d550d827437f49b30709 WatchSource:0}: Error finding container 7a41b895eb27d5da995e371c1f491475840bc5c6f965d550d827437f49b30709: Status 404 returned error can't find the container with id 7a41b895eb27d5da995e371c1f491475840bc5c6f965d550d827437f49b30709 Apr 16 18:10:40.488083 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.488064 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nth7f" Apr 16 18:10:40.493525 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.493508 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" Apr 16 18:10:40.493594 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:40.493554 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a6a8c0_5642_4be5_90d3_2827312267c3.slice/crio-9a8759909142bde4399c9412da834c0414bb8368f0c872af22ba8bd5457ca996 WatchSource:0}: Error finding container 9a8759909142bde4399c9412da834c0414bb8368f0c872af22ba8bd5457ca996: Status 404 returned error can't find the container with id 9a8759909142bde4399c9412da834c0414bb8368f0c872af22ba8bd5457ca996 Apr 16 18:10:40.499227 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.499211 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9pxxv" Apr 16 18:10:40.499489 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:40.499470 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb77ab947_a0da_49c9_9e9a_c4d71f2de312.slice/crio-7f8c425afd5341d8fd2ea6b871b60410b6b3be1835f8c01efae078d3bd672ad8 WatchSource:0}: Error finding container 7f8c425afd5341d8fd2ea6b871b60410b6b3be1835f8c01efae078d3bd672ad8: Status 404 returned error can't find the container with id 7f8c425afd5341d8fd2ea6b871b60410b6b3be1835f8c01efae078d3bd672ad8 Apr 16 18:10:40.504051 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.504023 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mq5wk" Apr 16 18:10:40.505555 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:40.505539 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a432113_3c33_4a2e_970d_baa3beba7cc7.slice/crio-08b1decfdb4038eb8b6556ff0da6932683dc81848908aa794f8e4193f6c89826 WatchSource:0}: Error finding container 08b1decfdb4038eb8b6556ff0da6932683dc81848908aa794f8e4193f6c89826: Status 404 returned error can't find the container with id 08b1decfdb4038eb8b6556ff0da6932683dc81848908aa794f8e4193f6c89826 Apr 16 18:10:40.511784 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:10:40.511764 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0189e11b_ff99_45bd_a5b2_5f0873332309.slice/crio-b9d331206266912cb721cef0e708dd092a1e25ae433d2ebfb6c0ae43a3bc563c WatchSource:0}: Error finding container b9d331206266912cb721cef0e708dd092a1e25ae433d2ebfb6c0ae43a3bc563c: Status 404 returned error can't find the container with id b9d331206266912cb721cef0e708dd092a1e25ae433d2ebfb6c0ae43a3bc563c Apr 16 18:10:40.722950 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.722852 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:40.723086 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:40.722986 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:40.723086 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:40.723067 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs podName:059c23ce-c3ea-4b83-a8a4-3c537435306e nodeName:}" failed. No retries permitted until 2026-04-16 18:10:41.723031212 +0000 UTC m=+3.106694667 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs") pod "network-metrics-daemon-8w622" (UID: "059c23ce-c3ea-4b83-a8a4-3c537435306e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:40.886777 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.886459 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:40.924630 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:40.924595 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xjp\" (UniqueName: \"kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp\") pod \"network-check-target-9tn4q\" (UID: \"30b4fba5-7d19-4433-b373-76fe14544828\") " pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:10:40.924827 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:40.924808 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:40.924920 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:40.924838 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:40.924920 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:40.924851 2567 projected.go:194] Error preparing data for projected volume kube-api-access-v8xjp for pod openshift-network-diagnostics/network-check-target-9tn4q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:40.924920 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:40.924910 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp podName:30b4fba5-7d19-4433-b373-76fe14544828 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:41.924890488 +0000 UTC m=+3.308553940 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-v8xjp" (UniqueName: "kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp") pod "network-check-target-9tn4q" (UID: "30b4fba5-7d19-4433-b373-76fe14544828") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:41.200404 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.200255 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:05:40 +0000 UTC" deadline="2028-01-28 05:59:22.37652387 +0000 UTC" Apr 16 18:10:41.200404 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.200305 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15635h48m41.176223197s" Apr 16 18:10:41.252253 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.252218 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" event={"ID":"b77ab947-a0da-49c9-9e9a-c4d71f2de312","Type":"ContainerStarted","Data":"7f8c425afd5341d8fd2ea6b871b60410b6b3be1835f8c01efae078d3bd672ad8"} Apr 16 18:10:41.272212 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.272179 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bdf4x" event={"ID":"a423aaa2-c387-4ccb-a9ca-a627b634154d","Type":"ContainerStarted","Data":"dfb8c7f10b539ac0d6d5986bbfee75ba90952ee186c6b2c88c0ab1dc238899e5"} Apr 16 18:10:41.280439 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.280383 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l8nt5" event={"ID":"185c47d9-cd61-422a-b3f0-0a6dd4148756","Type":"ContainerStarted","Data":"a4c9426f7e1970f75d3bfbe3d545d21fdd067f78955d629cd291bae3818abe42"} Apr 16 18:10:41.289191 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.289156 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nth7f" event={"ID":"a8a6a8c0-5642-4be5-90d3-2827312267c3","Type":"ContainerStarted","Data":"9a8759909142bde4399c9412da834c0414bb8368f0c872af22ba8bd5457ca996"} Apr 16 18:10:41.303957 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.303926 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5zgqq" event={"ID":"797c1b6c-d66b-4ed4-9555-07a34d9d2f2a","Type":"ContainerStarted","Data":"7a41b895eb27d5da995e371c1f491475840bc5c6f965d550d827437f49b30709"} Apr 16 18:10:41.323999 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.323950 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" event={"ID":"7da2a734-1f23-41a0-a455-b5e1b7871e27","Type":"ContainerStarted","Data":"ca7d02cbd8f2f3c0ee7633b8d3a07a2958330b1552b0345d76a5ba883c8ed4fc"} Apr 16 18:10:41.328941 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.328814 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" event={"ID":"b488ff1d-ff75-45f2-8473-6f48445e1b55","Type":"ContainerStarted","Data":"2c85601b8e0d80800ce9dc15ef686b00fa11d48effde905be6d5f6ac94317299"} Apr 16 18:10:41.348680 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.348647 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5wk" 
event={"ID":"0189e11b-ff99-45bd-a5b2-5f0873332309","Type":"ContainerStarted","Data":"b9d331206266912cb721cef0e708dd092a1e25ae433d2ebfb6c0ae43a3bc563c"} Apr 16 18:10:41.364727 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.364696 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9pxxv" event={"ID":"9a432113-3c33-4a2e-970d-baa3beba7cc7","Type":"ContainerStarted","Data":"08b1decfdb4038eb8b6556ff0da6932683dc81848908aa794f8e4193f6c89826"} Apr 16 18:10:41.515381 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.515302 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xx8pm"] Apr 16 18:10:41.521064 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.520786 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:41.521240 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:41.521214 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:10:41.530063 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.529866 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:41.530063 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.529934 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-dbus\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:41.530063 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.529990 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-kubelet-config\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:41.536915 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.536891 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:41.565799 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.565773 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:41.631204 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.631167 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-kubelet-config\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:41.631373 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.631229 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:41.631373 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.631293 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-dbus\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:41.631489 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.631473 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-dbus\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:41.631563 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.631546 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-kubelet-config\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:41.631695 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:41.631677 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:41.631749 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:41.631742 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret podName:3544bf39-6ebd-4771-b6fb-1c8d17fcabdb nodeName:}" failed. No retries permitted until 2026-04-16 18:10:42.131721607 +0000 UTC m=+3.515385059 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret") pod "global-pull-secret-syncer-xx8pm" (UID: "3544bf39-6ebd-4771-b6fb-1c8d17fcabdb") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:41.732336 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.732157 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:41.732336 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:41.732299 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:41.732578 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:41.732363 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs podName:059c23ce-c3ea-4b83-a8a4-3c537435306e nodeName:}" failed. No retries permitted until 2026-04-16 18:10:43.732345253 +0000 UTC m=+5.116008716 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs") pod "network-metrics-daemon-8w622" (UID: "059c23ce-c3ea-4b83-a8a4-3c537435306e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:41.934430 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:41.934348 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xjp\" (UniqueName: \"kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp\") pod \"network-check-target-9tn4q\" (UID: \"30b4fba5-7d19-4433-b373-76fe14544828\") " pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:10:41.934594 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:41.934502 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:41.934594 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:41.934520 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:41.934594 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:41.934532 2567 projected.go:194] Error preparing data for projected volume kube-api-access-v8xjp for pod openshift-network-diagnostics/network-check-target-9tn4q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:41.934594 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:41.934591 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp podName:30b4fba5-7d19-4433-b373-76fe14544828 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:43.934572814 +0000 UTC m=+5.318236279 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-v8xjp" (UniqueName: "kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp") pod "network-check-target-9tn4q" (UID: "30b4fba5-7d19-4433-b373-76fe14544828") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:42.136058 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:42.136006 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:42.136220 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:42.136153 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:42.136288 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:42.136221 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret podName:3544bf39-6ebd-4771-b6fb-1c8d17fcabdb nodeName:}" failed. No retries permitted until 2026-04-16 18:10:43.136201981 +0000 UTC m=+4.519865442 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret") pod "global-pull-secret-syncer-xx8pm" (UID: "3544bf39-6ebd-4771-b6fb-1c8d17fcabdb") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:42.202796 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:42.201395 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:05:40 +0000 UTC" deadline="2027-11-10 17:54:45.859504076 +0000 UTC" Apr 16 18:10:42.202796 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:42.201434 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13751h44m3.658074372s" Apr 16 18:10:42.236861 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:42.236831 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:10:42.237029 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:42.236960 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:10:42.237363 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:42.237294 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:42.237478 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:42.237407 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:10:43.144652 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:43.144541 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:43.144817 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:43.144755 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:43.144885 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:43.144819 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret podName:3544bf39-6ebd-4771-b6fb-1c8d17fcabdb nodeName:}" failed. No retries permitted until 2026-04-16 18:10:45.144800376 +0000 UTC m=+6.528463828 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret") pod "global-pull-secret-syncer-xx8pm" (UID: "3544bf39-6ebd-4771-b6fb-1c8d17fcabdb") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:43.237078 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:43.237032 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:43.237519 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:43.237182 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:10:43.750683 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:43.750645 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:43.750840 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:43.750793 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:43.750905 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:43.750859 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs podName:059c23ce-c3ea-4b83-a8a4-3c537435306e nodeName:}" failed. No retries permitted until 2026-04-16 18:10:47.750840619 +0000 UTC m=+9.134504071 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs") pod "network-metrics-daemon-8w622" (UID: "059c23ce-c3ea-4b83-a8a4-3c537435306e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:43.952407 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:43.952368 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xjp\" (UniqueName: \"kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp\") pod \"network-check-target-9tn4q\" (UID: \"30b4fba5-7d19-4433-b373-76fe14544828\") " pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:10:43.952563 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:43.952522 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:43.952563 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:43.952538 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:43.952563 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:43.952552 2567 projected.go:194] Error preparing data for projected volume kube-api-access-v8xjp for pod openshift-network-diagnostics/network-check-target-9tn4q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:43.952720 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:43.952618 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp podName:30b4fba5-7d19-4433-b373-76fe14544828 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:47.952597139 +0000 UTC m=+9.336260591 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-v8xjp" (UniqueName: "kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp") pod "network-check-target-9tn4q" (UID: "30b4fba5-7d19-4433-b373-76fe14544828") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:44.237364 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:44.236786 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:10:44.237364 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:44.236837 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:44.237364 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:44.236931 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:10:44.237364 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:44.237090 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:10:45.163326 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:45.163286 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:45.163525 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:45.163455 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:45.163599 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:45.163532 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret podName:3544bf39-6ebd-4771-b6fb-1c8d17fcabdb nodeName:}" failed. No retries permitted until 2026-04-16 18:10:49.163513494 +0000 UTC m=+10.547176958 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret") pod "global-pull-secret-syncer-xx8pm" (UID: "3544bf39-6ebd-4771-b6fb-1c8d17fcabdb") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:45.237706 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:45.237675 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:45.238185 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:45.237808 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:10:46.237614 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:46.237577 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:10:46.237614 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:46.237619 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:46.238255 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:46.237712 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:10:46.238255 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:46.237935 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:10:47.241360 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:47.240881 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:47.241360 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:47.241011 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:10:47.785288 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:47.785191 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:47.785482 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:47.785360 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:47.785482 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:47.785436 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs podName:059c23ce-c3ea-4b83-a8a4-3c537435306e nodeName:}" failed. No retries permitted until 2026-04-16 18:10:55.785415802 +0000 UTC m=+17.169079273 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs") pod "network-metrics-daemon-8w622" (UID: "059c23ce-c3ea-4b83-a8a4-3c537435306e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:47.987295 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:47.986637 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xjp\" (UniqueName: \"kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp\") pod \"network-check-target-9tn4q\" (UID: \"30b4fba5-7d19-4433-b373-76fe14544828\") " pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:10:47.987295 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:47.986823 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:47.987295 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:47.986841 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:47.987295 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:47.986855 2567 projected.go:194] Error preparing data for projected volume kube-api-access-v8xjp for pod openshift-network-diagnostics/network-check-target-9tn4q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:47.987295 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:47.986917 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp podName:30b4fba5-7d19-4433-b373-76fe14544828 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:55.98689768 +0000 UTC m=+17.370561150 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-v8xjp" (UniqueName: "kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp") pod "network-check-target-9tn4q" (UID: "30b4fba5-7d19-4433-b373-76fe14544828") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:48.237375 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:48.237292 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:10:48.237546 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:48.237428 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:10:48.237546 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:48.237479 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:48.237661 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:48.237609 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:10:49.194798 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:49.194507 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:49.194798 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:49.194639 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:49.194798 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:49.194702 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret podName:3544bf39-6ebd-4771-b6fb-1c8d17fcabdb nodeName:}" failed. No retries permitted until 2026-04-16 18:10:57.19468365 +0000 UTC m=+18.578347113 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret") pod "global-pull-secret-syncer-xx8pm" (UID: "3544bf39-6ebd-4771-b6fb-1c8d17fcabdb") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:49.238105 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:49.237975 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:49.238105 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:49.238101 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:10:50.237312 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:50.237224 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:50.237717 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:50.237360 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:10:50.237717 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:50.237427 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:10:50.237717 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:50.237535 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:10:51.237286 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:51.237251 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:51.237439 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:51.237361 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:10:52.236892 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:52.236855 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:10:52.237107 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:52.236855 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:52.237107 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:52.236978 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:10:52.237107 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:52.237053 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:10:53.237216 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:53.237182 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:53.237676 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:53.237336 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:10:54.237262 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:54.237224 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:54.237262 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:54.237258 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:10:54.237761 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:54.237373 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:10:54.237761 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:54.237507 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:10:55.237020 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:55.236989 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:55.237219 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:55.237125 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:10:55.840302 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:55.840262 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:10:55.840796 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:55.840435 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:55.840796 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:55.840511 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs podName:059c23ce-c3ea-4b83-a8a4-3c537435306e nodeName:}" failed. No retries permitted until 2026-04-16 18:11:11.840494273 +0000 UTC m=+33.224157722 (durationBeforeRetry 16s). 
Apr 16 18:10:55.840302 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:55.840262 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622"
Apr 16 18:10:55.840796 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:55.840435 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:55.840796 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:55.840511 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs podName:059c23ce-c3ea-4b83-a8a4-3c537435306e nodeName:}" failed. No retries permitted until 2026-04-16 18:11:11.840494273 +0000 UTC m=+33.224157722 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs") pod "network-metrics-daemon-8w622" (UID: "059c23ce-c3ea-4b83-a8a4-3c537435306e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:56.041259 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:56.041227 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xjp\" (UniqueName: \"kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp\") pod \"network-check-target-9tn4q\" (UID: \"30b4fba5-7d19-4433-b373-76fe14544828\") " pod="openshift-network-diagnostics/network-check-target-9tn4q"
Apr 16 18:10:56.041411 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:56.041359 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:10:56.041411 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:56.041375 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:10:56.041411 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:56.041386 2567 projected.go:194] Error preparing data for projected volume kube-api-access-v8xjp for pod openshift-network-diagnostics/network-check-target-9tn4q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:56.041539 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:56.041447 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp podName:30b4fba5-7d19-4433-b373-76fe14544828 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:12.041433035 +0000 UTC m=+33.425096484 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-v8xjp" (UniqueName: "kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp") pod "network-check-target-9tn4q" (UID: "30b4fba5-7d19-4433-b373-76fe14544828") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:56.237542 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:56.237456 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q"
Apr 16 18:10:56.237709 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:56.237464 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622"
Apr 16 18:10:56.237709 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:56.237599 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828"
Apr 16 18:10:56.237709 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:56.237680 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e"
Apr 16 18:10:57.236988 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:57.236959 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm"
Apr 16 18:10:57.237490 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:57.237095 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb"
Apr 16 18:10:57.249624 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:57.249600 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm"
Apr 16 18:10:57.249759 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:57.249707 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:10:57.249814 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:57.249770 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret podName:3544bf39-6ebd-4771-b6fb-1c8d17fcabdb nodeName:}" failed. No retries permitted until 2026-04-16 18:11:13.249750387 +0000 UTC m=+34.633413847 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret") pod "global-pull-secret-syncer-xx8pm" (UID: "3544bf39-6ebd-4771-b6fb-1c8d17fcabdb") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:10:58.239955 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:58.237237 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622"
Apr 16 18:10:58.239955 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:58.237264 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q"
pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:10:58.239955 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:58.237551 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:10:59.238867 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.238655 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:10:59.238998 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:10:59.238891 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:10:59.404554 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.404517 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-192.ec2.internal" event={"ID":"3b61c38e01c9c0b2139dda9e09bff1f5","Type":"ContainerStarted","Data":"4aa9e2433b7ca7470f16b99212ab05b11dcae19b0ea3349a88a39f1b9fe4a190"} Apr 16 18:10:59.406201 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.406159 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nth7f" event={"ID":"a8a6a8c0-5642-4be5-90d3-2827312267c3","Type":"ContainerStarted","Data":"bf83efc02ef17df74d53cf1a577a13c9f7bcd527cd58ff1c7d7e9661a23e8881"} Apr 16 18:10:59.407603 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.407578 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5zgqq" event={"ID":"797c1b6c-d66b-4ed4-9555-07a34d9d2f2a","Type":"ContainerStarted","Data":"ff89f96a51088df7b6cf966a65a6bccbc623088e48f5fad30a7d337ec5f6ef1d"} Apr 16 18:10:59.409468 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.409431 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" event={"ID":"7da2a734-1f23-41a0-a455-b5e1b7871e27","Type":"ContainerStarted","Data":"48f7c16cd67a3386305c9c44b98a6152fff31429903d054305e3790b892805fe"} Apr 16 18:10:59.412392 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.412366 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" event={"ID":"b488ff1d-ff75-45f2-8473-6f48445e1b55","Type":"ContainerStarted","Data":"e34b24f18b155eac59121f4355edf6b6b6b15da8622f30d07dfe51f7ccbdb87c"} Apr 16 18:10:59.412502 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.412401 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" event={"ID":"b488ff1d-ff75-45f2-8473-6f48445e1b55","Type":"ContainerStarted","Data":"a2648f95d7507799c80d483d5315f171cc2756f97cb5e3668dcb9cd06c19a95e"} Apr 16 18:10:59.412502 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.412416 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" 
event={"ID":"b488ff1d-ff75-45f2-8473-6f48445e1b55","Type":"ContainerStarted","Data":"e3d5eab7de8fea312c9b1cc25551cd8f6173b8e49b9ea2bbf9186063f96c8f63"} Apr 16 18:10:59.412502 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.412431 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" event={"ID":"b488ff1d-ff75-45f2-8473-6f48445e1b55","Type":"ContainerStarted","Data":"831fe5e2e0ee361af477199525971b3ed99cdbad61ae949cec63d6011ab02927"} Apr 16 18:10:59.412502 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.412444 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" event={"ID":"b488ff1d-ff75-45f2-8473-6f48445e1b55","Type":"ContainerStarted","Data":"bf7be822ac0780cb1d8f6125e814f2856f91bb8403947dee4705faa5862ed90a"} Apr 16 18:10:59.412502 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.412456 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" event={"ID":"b488ff1d-ff75-45f2-8473-6f48445e1b55","Type":"ContainerStarted","Data":"010794a7aa2c6ddeb18ee0e2b468e18de828fc877a418aca45d863dc64557200"} Apr 16 18:10:59.413849 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.413706 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5wk" event={"ID":"0189e11b-ff99-45bd-a5b2-5f0873332309","Type":"ContainerStarted","Data":"6fcbac1d79406ec9b3458b7c66ecc55ad6c8d06b927dcc546073543fa5778dc2"} Apr 16 18:10:59.415245 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.415223 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9pxxv" event={"ID":"9a432113-3c33-4a2e-970d-baa3beba7cc7","Type":"ContainerStarted","Data":"47339dcd38c7938e1929ae27c0b2c2fd0c6de0ee08e9ce4805f695d68a233783"} Apr 16 18:10:59.416731 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.416707 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" event={"ID":"b77ab947-a0da-49c9-9e9a-c4d71f2de312","Type":"ContainerStarted","Data":"4fb4ba315c4b4b9f836e9e78095a064a900aed1dbc55cf51d3c1fde2e5273799"} Apr 16 18:10:59.418176 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.418152 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bdf4x" event={"ID":"a423aaa2-c387-4ccb-a9ca-a627b634154d","Type":"ContainerStarted","Data":"067122c8d95750dea43beca9a62fba9bd8032686ddd0500bc88c4b8cab3071d3"} Apr 16 18:10:59.418890 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.418852 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-192.ec2.internal" podStartSLOduration=20.418840482 podStartE2EDuration="20.418840482s" podCreationTimestamp="2026-04-16 18:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:10:59.418485304 +0000 UTC m=+20.802148775" watchObservedRunningTime="2026-04-16 18:10:59.418840482 +0000 UTC m=+20.802503958" Apr 16 18:10:59.421884 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.421861 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal" event={"ID":"ddd6b3915ea610317bcd527288295ec4","Type":"ContainerStarted","Data":"6295057a9c67d18fd9622159c5f295d1b6ca0e302a092393e6cac498c5103698"} Apr 16 18:10:59.433151 
ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.433109 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-bdf4x" podStartSLOduration=2.7107582089999998 podStartE2EDuration="20.433096076s" podCreationTimestamp="2026-04-16 18:10:39 +0000 UTC" firstStartedPulling="2026-04-16 18:10:40.463328868 +0000 UTC m=+1.846992320" lastFinishedPulling="2026-04-16 18:10:58.185666736 +0000 UTC m=+19.569330187" observedRunningTime="2026-04-16 18:10:59.432753619 +0000 UTC m=+20.816417093" watchObservedRunningTime="2026-04-16 18:10:59.433096076 +0000 UTC m=+20.816759547" Apr 16 18:10:59.449573 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.449526 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2p2bw" podStartSLOduration=3.15612643 podStartE2EDuration="20.449510219s" podCreationTimestamp="2026-04-16 18:10:39 +0000 UTC" firstStartedPulling="2026-04-16 18:10:40.482925201 +0000 UTC m=+1.866588650" lastFinishedPulling="2026-04-16 18:10:57.77630898 +0000 UTC m=+19.159972439" observedRunningTime="2026-04-16 18:10:59.449160552 +0000 UTC m=+20.832824023" watchObservedRunningTime="2026-04-16 18:10:59.449510219 +0000 UTC m=+20.833173691" Apr 16 18:10:59.462795 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.462757 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5zgqq" podStartSLOduration=2.716505484 podStartE2EDuration="20.462740852s" podCreationTimestamp="2026-04-16 18:10:39 +0000 UTC" firstStartedPulling="2026-04-16 18:10:40.488920141 +0000 UTC m=+1.872583594" lastFinishedPulling="2026-04-16 18:10:58.235155508 +0000 UTC m=+19.618818962" observedRunningTime="2026-04-16 18:10:59.462159437 +0000 UTC m=+20.845822909" watchObservedRunningTime="2026-04-16 18:10:59.462740852 +0000 UTC m=+20.846404329" Apr 16 18:10:59.478560 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.478517 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nth7f" podStartSLOduration=2.725935297 podStartE2EDuration="20.478502057s" podCreationTimestamp="2026-04-16 18:10:39 +0000 UTC" firstStartedPulling="2026-04-16 18:10:40.494821705 +0000 UTC m=+1.878485154" lastFinishedPulling="2026-04-16 18:10:58.247388466 +0000 UTC m=+19.631051914" observedRunningTime="2026-04-16 18:10:59.478367768 +0000 UTC m=+20.862031252" watchObservedRunningTime="2026-04-16 18:10:59.478502057 +0000 UTC m=+20.862165530" Apr 16 18:10:59.495137 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:10:59.495002 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9pxxv" podStartSLOduration=2.816806673 podStartE2EDuration="20.494987454s" podCreationTimestamp="2026-04-16 18:10:39 +0000 UTC" firstStartedPulling="2026-04-16 18:10:40.507575733 +0000 UTC m=+1.891239182" lastFinishedPulling="2026-04-16 18:10:58.185756443 +0000 UTC m=+19.569419963" observedRunningTime="2026-04-16 18:10:59.494513677 +0000 UTC m=+20.878177149" watchObservedRunningTime="2026-04-16 18:10:59.494987454 +0000 UTC m=+20.878650927" Apr 16 18:11:00.236987 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:00.236953 2567 util.go:30] "No sandbox for pod can be found. 
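[Editor's note] The pod_startup_latency_tracker entries carry enough data to recompute both reported figures: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that same interval minus the image-pull window (lastFinishedPulling minus firstStartedPulling); pods that pulled nothing, like kube-apiserver-proxy with its zero-valued pull timestamps, report identical values for both. Checking that reading against the konnectivity-agent-bdf4x entry (timestamps copied verbatim, monotonic m=+ suffixes dropped):

```go
package main

import (
	"fmt"
	"time"
)

// Layout matching Go's default time.Time formatting, as printed in the log.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values from the konnectivity-agent-bdf4x entry above.
	created := mustParse("2026-04-16 18:10:39 +0000 UTC")
	firstPull := mustParse("2026-04-16 18:10:40.463328868 +0000 UTC")
	lastPull := mustParse("2026-04-16 18:10:58.185666736 +0000 UTC")
	observed := mustParse("2026-04-16 18:10:59.433096076 +0000 UTC")

	e2e := observed.Sub(created)
	slo := e2e - lastPull.Sub(firstPull) // end-to-end minus time spent pulling images

	fmt.Println("podStartE2EDuration:", e2e) // 20.433096076s, matching the log
	fmt.Println("podStartSLOduration:", slo) // 2.710758208s
}
```

This prints 2.710758208s where the log shows 2.7107582089999998; the two are the same quantity, the logged form being a float64 rendering of the subtraction rather than integer-nanosecond arithmetic.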
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:11:00.237192 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:00.237158 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:11:00.237270 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:00.236953 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:11:00.237318 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:00.237286 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:11:00.291332 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:00.291310 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:11:00.425153 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:00.425117 2567 generic.go:358] "Generic (PLEG): container finished" podID="0189e11b-ff99-45bd-a5b2-5f0873332309" containerID="6fcbac1d79406ec9b3458b7c66ecc55ad6c8d06b927dcc546073543fa5778dc2" exitCode=0 Apr 16 18:11:00.425587 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:00.425205 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5wk" event={"ID":"0189e11b-ff99-45bd-a5b2-5f0873332309","Type":"ContainerDied","Data":"6fcbac1d79406ec9b3458b7c66ecc55ad6c8d06b927dcc546073543fa5778dc2"} Apr 16 18:11:00.426884 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:00.426860 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" event={"ID":"b77ab947-a0da-49c9-9e9a-c4d71f2de312","Type":"ContainerStarted","Data":"6aa40fcb07d0a7f3b90a24a52183b4a24ae680c3294bb2ab00a13774e9f18540"} Apr 16 18:11:00.428057 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:00.428023 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l8nt5" event={"ID":"185c47d9-cd61-422a-b3f0-0a6dd4148756","Type":"ContainerStarted","Data":"cad2cfd835fee4bfce986c093d3313ae1ae05910992a1685722987f3321b6447"} Apr 16 18:11:00.429548 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:00.429528 2567 generic.go:358] "Generic (PLEG): container finished" podID="ddd6b3915ea610317bcd527288295ec4" containerID="6295057a9c67d18fd9622159c5f295d1b6ca0e302a092393e6cac498c5103698" exitCode=0 Apr 16 18:11:00.429660 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:00.429628 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal" event={"ID":"ddd6b3915ea610317bcd527288295ec4","Type":"ContainerDied","Data":"6295057a9c67d18fd9622159c5f295d1b6ca0e302a092393e6cac498c5103698"} Apr 16 18:11:00.429714 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:00.429679 
2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal" event={"ID":"ddd6b3915ea610317bcd527288295ec4","Type":"ContainerStarted","Data":"6b1e747bf1d0b9b7c1ccd997edec83911d2cadb34ff695f9efa389891dca1fc5"} Apr 16 18:11:00.461507 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:00.461474 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-bdf4x" Apr 16 18:11:00.462067 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:00.462032 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-bdf4x" Apr 16 18:11:00.474979 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:00.474940 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-l8nt5" podStartSLOduration=3.667891145 podStartE2EDuration="21.474927163s" podCreationTimestamp="2026-04-16 18:10:39 +0000 UTC" firstStartedPulling="2026-04-16 18:10:40.429500299 +0000 UTC m=+1.813163749" lastFinishedPulling="2026-04-16 18:10:58.236536307 +0000 UTC m=+19.620199767" observedRunningTime="2026-04-16 18:11:00.474686403 +0000 UTC m=+21.858349874" watchObservedRunningTime="2026-04-16 18:11:00.474927163 +0000 UTC m=+21.858590630" Apr 16 18:11:00.475259 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:00.475239 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-192.ec2.internal" podStartSLOduration=21.475232705 podStartE2EDuration="21.475232705s" podCreationTimestamp="2026-04-16 18:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:11:00.461630041 +0000 UTC m=+21.845293511" watchObservedRunningTime="2026-04-16 18:11:00.475232705 +0000 UTC m=+21.858896187" Apr 16 18:11:01.190966 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:01.190856 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:11:00.291327195Z","UUID":"0387aa11-ffe7-4e54-a9e4-0de4661b43a1","Handler":null,"Name":"","Endpoint":""} Apr 16 18:11:01.194959 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:01.194932 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:11:01.194959 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:01.194962 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:11:01.237339 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:01.237270 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:11:01.237515 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:01.237408 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
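[Editor's note] The CSI registration thread runs through three steps above: plugin_watcher.go spots the registration socket ebs.csi.aws.com-reg.sock appearing under /var/lib/kubelet/plugins_registry at 18:11:00.291, the operation executor picks it up from the desired-state cache about a second later, and csi_plugin.go then validates and registers the driver at its real CSI endpoint under /var/lib/kubelet/plugins/. A sketch that lists such registration sockets the way the watcher's cache is fed (paths from the log; the real watcher is event-driven rather than a directory scan):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

const pluginRegistry = "/var/lib/kubelet/plugins_registry" // directory from the log

func main() {
	entries, err := os.ReadDir(pluginRegistry)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	for _, e := range entries {
		// Each *-reg.sock is a plugin registration endpoint; the kubelet dials it
		// over gRPC and asks the plugin for its name, type, and service endpoint.
		if ok, _ := filepath.Match("*-reg.sock", e.Name()); ok {
			fmt.Println("registration socket:", filepath.Join(pluginRegistry, e.Name()))
		}
	}
}
```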
pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:11:01.434740 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:01.434652 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" event={"ID":"b488ff1d-ff75-45f2-8473-6f48445e1b55","Type":"ContainerStarted","Data":"e0b37ef90c41a50ff43692e4063a5b4fae23365bbcdcaf1e4bc970ab7dad2c5c"} Apr 16 18:11:01.436856 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:01.436810 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" event={"ID":"b77ab947-a0da-49c9-9e9a-c4d71f2de312","Type":"ContainerStarted","Data":"1fdbfcdf2c2c60a2a69af3c5ba06a8b3a1ac77a2fd308fa97ca63d8f718dd927"} Apr 16 18:11:01.437483 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:01.437459 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-bdf4x" Apr 16 18:11:01.437983 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:01.437962 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-bdf4x" Apr 16 18:11:01.455246 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:01.455203 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6d2m7" podStartSLOduration=1.800644468 podStartE2EDuration="22.455189624s" podCreationTimestamp="2026-04-16 18:10:39 +0000 UTC" firstStartedPulling="2026-04-16 18:10:40.502213874 +0000 UTC m=+1.885877338" lastFinishedPulling="2026-04-16 18:11:01.156759043 +0000 UTC m=+22.540422494" observedRunningTime="2026-04-16 18:11:01.455096748 +0000 UTC m=+22.838760220" watchObservedRunningTime="2026-04-16 18:11:01.455189624 +0000 UTC m=+22.838853098" Apr 16 18:11:02.237020 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:02.236991 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:11:02.237196 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:02.237110 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:11:02.237235 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:02.237196 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:11:02.237305 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:02.237287 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:11:03.237330 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:03.237150 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:11:03.237732 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:03.237487 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:11:04.236718 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:04.236687 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:11:04.236884 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:04.236693 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:11:04.236884 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:04.236803 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:11:04.236884 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:04.236872 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:11:05.237748 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:05.237532 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:11:05.238411 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:05.237789 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:11:05.447335 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:05.447291 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" event={"ID":"b488ff1d-ff75-45f2-8473-6f48445e1b55","Type":"ContainerStarted","Data":"ea8cfdcb7e69a38a64a6f462ba5f6b724350cdd3998e03b6f9ea233b1810d06b"} Apr 16 18:11:05.447623 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:05.447600 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:11:05.449012 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:05.448992 2567 generic.go:358] "Generic (PLEG): container finished" podID="0189e11b-ff99-45bd-a5b2-5f0873332309" containerID="534ce68d08deb2a78b6c9e949895b7604b20411690451e77d0c8181bfd9989c0" exitCode=0 Apr 16 18:11:05.449139 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:05.449024 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5wk" event={"ID":"0189e11b-ff99-45bd-a5b2-5f0873332309","Type":"ContainerDied","Data":"534ce68d08deb2a78b6c9e949895b7604b20411690451e77d0c8181bfd9989c0"} Apr 16 18:11:05.462354 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:05.462335 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:11:05.475820 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:05.475784 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" podStartSLOduration=8.246962093 podStartE2EDuration="26.475773133s" podCreationTimestamp="2026-04-16 18:10:39 +0000 UTC" firstStartedPulling="2026-04-16 18:10:40.447984614 +0000 UTC m=+1.831648063" lastFinishedPulling="2026-04-16 18:10:58.676795648 +0000 UTC m=+20.060459103" observedRunningTime="2026-04-16 18:11:05.474518351 +0000 UTC m=+26.858181822" watchObservedRunningTime="2026-04-16 18:11:05.475773133 +0000 UTC m=+26.859436596" Apr 16 18:11:06.237684 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:06.237657 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:11:06.237863 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:06.237661 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:11:06.237863 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:06.237788 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:11:06.238238 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:06.237910 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:11:06.421059 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:06.421021 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9tn4q"] Apr 16 18:11:06.423870 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:06.423845 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xx8pm"] Apr 16 18:11:06.423987 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:06.423943 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:11:06.424103 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:06.424079 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:11:06.424441 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:06.424418 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8w622"] Apr 16 18:11:06.452755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:06.452736 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:11:06.452872 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:06.452794 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5wk" event={"ID":"0189e11b-ff99-45bd-a5b2-5f0873332309","Type":"ContainerStarted","Data":"f7130d99c8516973e3eae112df40e50d8eee1872e2050097ac1cae5f40619be7"} Apr 16 18:11:06.452872 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:06.452843 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:11:06.453021 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:06.453007 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:11:06.453144 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:06.453124 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:11:06.453244 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:06.453229 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 18:11:06.453627 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:06.453595 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:11:06.472000 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:06.471937 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:11:06.953658 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:06.953631 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:11:07.456125 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:07.456097 2567 generic.go:358] "Generic (PLEG): container finished" podID="0189e11b-ff99-45bd-a5b2-5f0873332309" containerID="f7130d99c8516973e3eae112df40e50d8eee1872e2050097ac1cae5f40619be7" exitCode=0 Apr 16 18:11:07.456528 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:07.456186 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5wk" event={"ID":"0189e11b-ff99-45bd-a5b2-5f0873332309","Type":"ContainerDied","Data":"f7130d99c8516973e3eae112df40e50d8eee1872e2050097ac1cae5f40619be7"} Apr 16 18:11:08.237724 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:08.237529 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:11:08.237826 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:08.237532 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:11:08.237826 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:08.237766 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:11:08.237928 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:08.237540 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:11:08.237972 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:08.237875 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:11:08.238011 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:08.237956 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:11:08.461110 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:08.461077 2567 generic.go:358] "Generic (PLEG): container finished" podID="0189e11b-ff99-45bd-a5b2-5f0873332309" containerID="6ddf3a5a996c4e28685d18b2a9280be7547a04d1e2a7f71945f35616e7022185" exitCode=0 Apr 16 18:11:08.461560 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:08.461161 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5wk" event={"ID":"0189e11b-ff99-45bd-a5b2-5f0873332309","Type":"ContainerDied","Data":"6ddf3a5a996c4e28685d18b2a9280be7547a04d1e2a7f71945f35616e7022185"} Apr 16 18:11:10.237492 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:10.237460 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:11:10.237492 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:10.237480 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:11:10.238166 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:10.237483 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:11:10.238166 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:10.237582 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xx8pm" podUID="3544bf39-6ebd-4771-b6fb-1c8d17fcabdb" Apr 16 18:11:10.238166 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:10.237651 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tn4q" podUID="30b4fba5-7d19-4433-b373-76fe14544828" Apr 16 18:11:10.238166 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:10.237702 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8w622" podUID="059c23ce-c3ea-4b83-a8a4-3c537435306e" Apr 16 18:11:11.446161 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.446100 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-192.ec2.internal" event="NodeReady" Apr 16 18:11:11.446616 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.446295 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:11:11.495164 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.492680 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wcq4f"] Apr 16 18:11:11.511404 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.511376 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dmjvg"] Apr 16 18:11:11.511584 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.511560 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:11.514586 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.514565 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:11:11.514711 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.514635 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-n88vn\"" Apr 16 18:11:11.514711 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.514642 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:11:11.537369 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.537349 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wcq4f"] Apr 16 18:11:11.537514 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.537378 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dmjvg"] Apr 16 18:11:11.537514 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.537497 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dmjvg" Apr 16 18:11:11.540432 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.540411 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:11:11.541428 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.541412 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kd8rq\"" Apr 16 18:11:11.541908 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.541892 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:11:11.542220 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.542200 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:11:11.666910 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.666873 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert\") pod \"ingress-canary-dmjvg\" (UID: \"199e8adf-4b79-4ccd-a556-c3d828499a76\") " pod="openshift-ingress-canary/ingress-canary-dmjvg" Apr 16 18:11:11.666910 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.666913 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt4j7\" (UniqueName: \"kubernetes.io/projected/199e8adf-4b79-4ccd-a556-c3d828499a76-kube-api-access-lt4j7\") pod \"ingress-canary-dmjvg\" (UID: \"199e8adf-4b79-4ccd-a556-c3d828499a76\") " pod="openshift-ingress-canary/ingress-canary-dmjvg" Apr 16 18:11:11.667115 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.666945 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b2f5246-4971-4502-a370-da270e3fd3dc-config-volume\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:11.667115 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.666970 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:11.667115 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.667079 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvmgx\" (UniqueName: \"kubernetes.io/projected/6b2f5246-4971-4502-a370-da270e3fd3dc-kube-api-access-gvmgx\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:11.667115 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.667109 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6b2f5246-4971-4502-a370-da270e3fd3dc-tmp-dir\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:11.768159 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.768081 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gvmgx\" (UniqueName: \"kubernetes.io/projected/6b2f5246-4971-4502-a370-da270e3fd3dc-kube-api-access-gvmgx\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:11.768159 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.768123 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6b2f5246-4971-4502-a370-da270e3fd3dc-tmp-dir\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:11.768363 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.768269 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert\") pod \"ingress-canary-dmjvg\" (UID: \"199e8adf-4b79-4ccd-a556-c3d828499a76\") " pod="openshift-ingress-canary/ingress-canary-dmjvg" Apr 16 18:11:11.768363 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.768304 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt4j7\" (UniqueName: \"kubernetes.io/projected/199e8adf-4b79-4ccd-a556-c3d828499a76-kube-api-access-lt4j7\") pod \"ingress-canary-dmjvg\" (UID: \"199e8adf-4b79-4ccd-a556-c3d828499a76\") " pod="openshift-ingress-canary/ingress-canary-dmjvg" Apr 16 18:11:11.768363 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.768334 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b2f5246-4971-4502-a370-da270e3fd3dc-config-volume\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:11.768363 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.768359 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:11.768527 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:11.768417 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:11:11.768527 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.768423 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6b2f5246-4971-4502-a370-da270e3fd3dc-tmp-dir\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:11.768527 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:11.768440 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:11:11.768527 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:11.768505 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert podName:199e8adf-4b79-4ccd-a556-c3d828499a76 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:12.268486129 +0000 UTC m=+33.652149592 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert") pod "ingress-canary-dmjvg" (UID: "199e8adf-4b79-4ccd-a556-c3d828499a76") : secret "canary-serving-cert" not found Apr 16 18:11:11.768527 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:11.768519 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls podName:6b2f5246-4971-4502-a370-da270e3fd3dc nodeName:}" failed. No retries permitted until 2026-04-16 18:11:12.268513206 +0000 UTC m=+33.652176655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls") pod "dns-default-wcq4f" (UID: "6b2f5246-4971-4502-a370-da270e3fd3dc") : secret "dns-default-metrics-tls" not found Apr 16 18:11:11.779436 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.779400 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b2f5246-4971-4502-a370-da270e3fd3dc-config-volume\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:11.779641 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.779622 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvmgx\" (UniqueName: \"kubernetes.io/projected/6b2f5246-4971-4502-a370-da270e3fd3dc-kube-api-access-gvmgx\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:11.779692 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.779637 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt4j7\" (UniqueName: \"kubernetes.io/projected/199e8adf-4b79-4ccd-a556-c3d828499a76-kube-api-access-lt4j7\") pod \"ingress-canary-dmjvg\" (UID: \"199e8adf-4b79-4ccd-a556-c3d828499a76\") " pod="openshift-ingress-canary/ingress-canary-dmjvg" Apr 16 18:11:11.869062 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:11.869011 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:11:11.869230 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:11.869137 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:11:11.869230 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:11.869203 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs podName:059c23ce-c3ea-4b83-a8a4-3c537435306e nodeName:}" failed. No retries permitted until 2026-04-16 18:11:43.869186967 +0000 UTC m=+65.252850420 (durationBeforeRetry 32s). 
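[Editor's note] Two distinct failure modes are interleaved here. secret "canary-serving-cert" not found and secret "dns-default-metrics-tls" not found mean the objects genuinely do not exist in the API yet (their operators have not created them), whereas metrics-certs is still the kubelet-side "not registered" cache gap from earlier. Both feed the same per-volume exponential backoff in nestedpendingoperations.go: fresh volumes start at durationBeforeRetry 500ms and double (1s and 2s follow below), while metrics-certs, failing since 18:10:55, has already climbed from 16s to 32s. The schedule, assuming the kubelet's usual 500ms initial delay and a cap of roughly 2m2s (the cap is from memory and should be treated as an assumption; the doubling itself is plain in the log):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed cap; initial delay and doubling factor match the log's progression.
	const max = 2*time.Minute + 2*time.Second
	d := 500 * time.Millisecond
	for i := 0; i < 10; i++ {
		fmt.Printf("attempt %2d: durationBeforeRetry %v\n", i+1, d)
		if d *= 2; d > max {
			d = max
		}
	}
	// Prints 500ms, 1s, 2s, 4s, 8s, 16s, 32s, 1m4s, then holds at the cap.
}
```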
Apr 16 18:11:11.869230 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:11.869203 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs podName:059c23ce-c3ea-4b83-a8a4-3c537435306e nodeName:}" failed. No retries permitted until 2026-04-16 18:11:43.869186967 +0000 UTC m=+65.252850420 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs") pod "network-metrics-daemon-8w622" (UID: "059c23ce-c3ea-4b83-a8a4-3c537435306e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:11:12.070528 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:12.070437 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xjp\" (UniqueName: \"kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp\") pod \"network-check-target-9tn4q\" (UID: \"30b4fba5-7d19-4433-b373-76fe14544828\") " pod="openshift-network-diagnostics/network-check-target-9tn4q"
Apr 16 18:11:12.070669 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:12.070604 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:11:12.070669 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:12.070631 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:11:12.070669 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:12.070644 2567 projected.go:194] Error preparing data for projected volume kube-api-access-v8xjp for pod openshift-network-diagnostics/network-check-target-9tn4q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:11:12.070788 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:12.070703 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp podName:30b4fba5-7d19-4433-b373-76fe14544828 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:44.070685789 +0000 UTC m=+65.454349238 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-v8xjp" (UniqueName: "kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp") pod "network-check-target-9tn4q" (UID: "30b4fba5-7d19-4433-b373-76fe14544828") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:11:12.237242 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:12.237203 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm"
Apr 16 18:11:12.237242 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:12.237228 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tn4q"
Apr 16 18:11:12.237479 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:12.237330 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622"
Apr 16 18:11:12.240304 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:12.240278 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 18:11:12.241324 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:12.241298 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:11:12.241324 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:12.241316 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:11:12.241573 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:12.241321 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-rbrbw\""
Apr 16 18:11:12.241573 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:12.241454 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xdm8g\""
Apr 16 18:11:12.241573 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:12.241515 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:11:12.272627 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:12.272601 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert\") pod \"ingress-canary-dmjvg\" (UID: \"199e8adf-4b79-4ccd-a556-c3d828499a76\") " pod="openshift-ingress-canary/ingress-canary-dmjvg"
Apr 16 18:11:12.272722 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:12.272658 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f"
Apr 16 18:11:12.272766 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:12.272740 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:11:12.272766 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:12.272758 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:11:12.272844 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:12.272794 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert podName:199e8adf-4b79-4ccd-a556-c3d828499a76 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:13.272780638 +0000 UTC m=+34.656444086 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert") pod "ingress-canary-dmjvg" (UID: "199e8adf-4b79-4ccd-a556-c3d828499a76") : secret "canary-serving-cert" not found
Apr 16 18:11:12.272844 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:12.272816 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls podName:6b2f5246-4971-4502-a370-da270e3fd3dc nodeName:}" failed. No retries permitted until 2026-04-16 18:11:13.272798975 +0000 UTC m=+34.656462432 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls") pod "dns-default-wcq4f" (UID: "6b2f5246-4971-4502-a370-da270e3fd3dc") : secret "dns-default-metrics-tls" not found
Apr 16 18:11:13.279498 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:13.279456 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm"
Apr 16 18:11:13.280212 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:13.279566 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert\") pod \"ingress-canary-dmjvg\" (UID: \"199e8adf-4b79-4ccd-a556-c3d828499a76\") " pod="openshift-ingress-canary/ingress-canary-dmjvg"
Apr 16 18:11:13.280212 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:13.279610 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f"
Apr 16 18:11:13.280212 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:13.279712 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:11:13.280212 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:13.279724 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:11:13.280212 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:13.279790 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert podName:199e8adf-4b79-4ccd-a556-c3d828499a76 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:15.279769263 +0000 UTC m=+36.663432713 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert") pod "ingress-canary-dmjvg" (UID: "199e8adf-4b79-4ccd-a556-c3d828499a76") : secret "canary-serving-cert" not found
Apr 16 18:11:13.280212 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:13.279814 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls podName:6b2f5246-4971-4502-a370-da270e3fd3dc nodeName:}" failed. No retries permitted until 2026-04-16 18:11:15.279802538 +0000 UTC m=+36.663465990 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls") pod "dns-default-wcq4f" (UID: "6b2f5246-4971-4502-a370-da270e3fd3dc") : secret "dns-default-metrics-tls" not found Apr 16 18:11:13.282094 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:13.282074 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3544bf39-6ebd-4771-b6fb-1c8d17fcabdb-original-pull-secret\") pod \"global-pull-secret-syncer-xx8pm\" (UID: \"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb\") " pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:11:13.449930 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:13.449888 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xx8pm" Apr 16 18:11:14.092381 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:14.092153 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xx8pm"] Apr 16 18:11:14.194357 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:11:14.194315 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3544bf39_6ebd_4771_b6fb_1c8d17fcabdb.slice/crio-a25335ac2cf4b7c2a7c9634bdeb3fdc59369c4a11b43344c871833d6101540e8 WatchSource:0}: Error finding container a25335ac2cf4b7c2a7c9634bdeb3fdc59369c4a11b43344c871833d6101540e8: Status 404 returned error can't find the container with id a25335ac2cf4b7c2a7c9634bdeb3fdc59369c4a11b43344c871833d6101540e8 Apr 16 18:11:14.474619 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:14.474507 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xx8pm" event={"ID":"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb","Type":"ContainerStarted","Data":"a25335ac2cf4b7c2a7c9634bdeb3fdc59369c4a11b43344c871833d6101540e8"} Apr 16 18:11:14.477605 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:14.477576 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5wk" event={"ID":"0189e11b-ff99-45bd-a5b2-5f0873332309","Type":"ContainerStarted","Data":"c886f7342716849d0a5bf0e54606d75478d00762b6f9f9b741bf9cab77b4bbda"} Apr 16 18:11:15.297199 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:15.297168 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert\") pod \"ingress-canary-dmjvg\" (UID: \"199e8adf-4b79-4ccd-a556-c3d828499a76\") " pod="openshift-ingress-canary/ingress-canary-dmjvg" Apr 16 18:11:15.297377 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:15.297216 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:15.297377 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:15.297327 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:11:15.297553 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:15.297397 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls podName:6b2f5246-4971-4502-a370-da270e3fd3dc 
nodeName:}" failed. No retries permitted until 2026-04-16 18:11:19.297371921 +0000 UTC m=+40.681035377 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls") pod "dns-default-wcq4f" (UID: "6b2f5246-4971-4502-a370-da270e3fd3dc") : secret "dns-default-metrics-tls" not found Apr 16 18:11:15.297553 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:15.297327 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:11:15.297553 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:15.297476 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert podName:199e8adf-4b79-4ccd-a556-c3d828499a76 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:19.297458369 +0000 UTC m=+40.681121818 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert") pod "ingress-canary-dmjvg" (UID: "199e8adf-4b79-4ccd-a556-c3d828499a76") : secret "canary-serving-cert" not found Apr 16 18:11:15.482600 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:15.482564 2567 generic.go:358] "Generic (PLEG): container finished" podID="0189e11b-ff99-45bd-a5b2-5f0873332309" containerID="c886f7342716849d0a5bf0e54606d75478d00762b6f9f9b741bf9cab77b4bbda" exitCode=0 Apr 16 18:11:15.483090 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:15.482654 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5wk" event={"ID":"0189e11b-ff99-45bd-a5b2-5f0873332309","Type":"ContainerDied","Data":"c886f7342716849d0a5bf0e54606d75478d00762b6f9f9b741bf9cab77b4bbda"} Apr 16 18:11:16.487263 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:16.487227 2567 generic.go:358] "Generic (PLEG): container finished" podID="0189e11b-ff99-45bd-a5b2-5f0873332309" containerID="8d1b5ff57363bec0bf0f628ae070e9d1fff07e2310da3ab23fda02f02122f400" exitCode=0 Apr 16 18:11:16.487625 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:16.487306 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5wk" event={"ID":"0189e11b-ff99-45bd-a5b2-5f0873332309","Type":"ContainerDied","Data":"8d1b5ff57363bec0bf0f628ae070e9d1fff07e2310da3ab23fda02f02122f400"} Apr 16 18:11:18.498056 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:18.498017 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5wk" event={"ID":"0189e11b-ff99-45bd-a5b2-5f0873332309","Type":"ContainerStarted","Data":"1100510ee2fbdcfc1271bf9302ed3881d5baca6ad30373de7a1a2e3f04116638"} Apr 16 18:11:18.499278 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:18.499248 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xx8pm" event={"ID":"3544bf39-6ebd-4771-b6fb-1c8d17fcabdb","Type":"ContainerStarted","Data":"bd3841eb1575cf997db41b5bd2c796b6cfc77aefa3fa92aa0c918e75ef739f8e"} Apr 16 18:11:18.524212 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:18.524157 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mq5wk" podStartSLOduration=5.798481903 podStartE2EDuration="39.524139787s" podCreationTimestamp="2026-04-16 18:10:39 +0000 UTC" firstStartedPulling="2026-04-16 18:10:40.513302661 +0000 UTC 
m=+1.896966114" lastFinishedPulling="2026-04-16 18:11:14.23896055 +0000 UTC m=+35.622623998" observedRunningTime="2026-04-16 18:11:18.523608243 +0000 UTC m=+39.907271715" watchObservedRunningTime="2026-04-16 18:11:18.524139787 +0000 UTC m=+39.907803259" Apr 16 18:11:18.539980 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:18.539869 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xx8pm" podStartSLOduration=33.465659783 podStartE2EDuration="37.539724449s" podCreationTimestamp="2026-04-16 18:10:41 +0000 UTC" firstStartedPulling="2026-04-16 18:11:14.216850906 +0000 UTC m=+35.600514356" lastFinishedPulling="2026-04-16 18:11:18.290915563 +0000 UTC m=+39.674579022" observedRunningTime="2026-04-16 18:11:18.539583848 +0000 UTC m=+39.923247320" watchObservedRunningTime="2026-04-16 18:11:18.539724449 +0000 UTC m=+39.923387921" Apr 16 18:11:19.333004 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:19.332959 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert\") pod \"ingress-canary-dmjvg\" (UID: \"199e8adf-4b79-4ccd-a556-c3d828499a76\") " pod="openshift-ingress-canary/ingress-canary-dmjvg" Apr 16 18:11:19.333174 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:19.333013 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:19.333174 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:19.333134 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:11:19.333257 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:19.333195 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert podName:199e8adf-4b79-4ccd-a556-c3d828499a76 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:27.33317852 +0000 UTC m=+48.716841972 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert") pod "ingress-canary-dmjvg" (UID: "199e8adf-4b79-4ccd-a556-c3d828499a76") : secret "canary-serving-cert" not found Apr 16 18:11:19.333257 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:19.333134 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:11:19.333257 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:19.333233 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls podName:6b2f5246-4971-4502-a370-da270e3fd3dc nodeName:}" failed. No retries permitted until 2026-04-16 18:11:27.333224296 +0000 UTC m=+48.716887751 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls") pod "dns-default-wcq4f" (UID: "6b2f5246-4971-4502-a370-da270e3fd3dc") : secret "dns-default-metrics-tls" not found Apr 16 18:11:27.389094 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:27.389031 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert\") pod \"ingress-canary-dmjvg\" (UID: \"199e8adf-4b79-4ccd-a556-c3d828499a76\") " pod="openshift-ingress-canary/ingress-canary-dmjvg" Apr 16 18:11:27.389573 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:27.389105 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:27.389573 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:27.389181 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:11:27.389573 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:27.389241 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert podName:199e8adf-4b79-4ccd-a556-c3d828499a76 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:43.389225863 +0000 UTC m=+64.772889312 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert") pod "ingress-canary-dmjvg" (UID: "199e8adf-4b79-4ccd-a556-c3d828499a76") : secret "canary-serving-cert" not found Apr 16 18:11:27.389573 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:27.389186 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:11:27.389573 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:27.389316 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls podName:6b2f5246-4971-4502-a370-da270e3fd3dc nodeName:}" failed. No retries permitted until 2026-04-16 18:11:43.38930258 +0000 UTC m=+64.772966034 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls") pod "dns-default-wcq4f" (UID: "6b2f5246-4971-4502-a370-da270e3fd3dc") : secret "dns-default-metrics-tls" not found Apr 16 18:11:38.471873 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:38.471847 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fk24r" Apr 16 18:11:41.024681 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.024646 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-6sggh"] Apr 16 18:11:41.028764 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.028744 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-6sggh" Apr 16 18:11:41.031409 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.031388 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-bjn9t\"" Apr 16 18:11:41.031500 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.031407 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:11:41.031500 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.031394 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 18:11:41.036317 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.036297 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-6sggh"] Apr 16 18:11:41.129199 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.129168 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5c668c7d56-vrfwm"] Apr 16 18:11:41.132146 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.132126 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.135533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.135515 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 18:11:41.135643 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.135535 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 18:11:41.136123 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.136101 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 18:11:41.136248 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.136151 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 18:11:41.136248 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.136167 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 18:11:41.136978 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.136962 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-srr9z\"" Apr 16 18:11:41.137085 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.137075 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 18:11:41.145776 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.145755 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5c668c7d56-vrfwm"] Apr 16 18:11:41.186676 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.186636 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5ddg\" (UniqueName: \"kubernetes.io/projected/ddccaae8-9c6a-4a8c-8840-86e365b52d47-kube-api-access-h5ddg\") pod \"volume-data-source-validator-7d955d5dd4-6sggh\" (UID: \"ddccaae8-9c6a-4a8c-8840-86e365b52d47\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-6sggh" Apr 16 18:11:41.230478 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.230448 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn"] Apr 16 18:11:41.233108 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.233093 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn"] Apr 16 18:11:41.233246 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.233229 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" Apr 16 18:11:41.235809 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.235790 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-hwrv5\"" Apr 16 18:11:41.235896 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.235819 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:11:41.235935 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.235906 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" Apr 16 18:11:41.236185 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.236166 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 18:11:41.236185 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.236179 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 18:11:41.236412 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.236399 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 18:11:41.240275 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.240256 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 18:11:41.241265 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.241247 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 18:11:41.241534 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.241518 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 18:11:41.241749 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.241734 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:11:41.242367 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.242351 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-pgtsg\"" Apr 16 18:11:41.247974 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.247957 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn"] Apr 16 18:11:41.250516 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.250499 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn"] Apr 16 18:11:41.287587 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.287523 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-default-certificate\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.287587 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.287554 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.287587 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.287573 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kkbm\" (UniqueName: \"kubernetes.io/projected/6347d10a-8636-4345-ac41-33e915aa23d9-kube-api-access-2kkbm\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.287780 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.287644 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5ddg\" (UniqueName: \"kubernetes.io/projected/ddccaae8-9c6a-4a8c-8840-86e365b52d47-kube-api-access-h5ddg\") pod \"volume-data-source-validator-7d955d5dd4-6sggh\" (UID: \"ddccaae8-9c6a-4a8c-8840-86e365b52d47\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-6sggh" Apr 16 18:11:41.287780 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.287675 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-stats-auth\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.287780 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.287695 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.296590 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.296563 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5ddg\" (UniqueName: \"kubernetes.io/projected/ddccaae8-9c6a-4a8c-8840-86e365b52d47-kube-api-access-h5ddg\") pod \"volume-data-source-validator-7d955d5dd4-6sggh\" (UID: \"ddccaae8-9c6a-4a8c-8840-86e365b52d47\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-6sggh" Apr 16 18:11:41.337403 ip-10-0-141-192 kubenswrapper[2567]: 
I0416 18:11:41.337370 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-6sggh" Apr 16 18:11:41.389432 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.389393 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csqnv\" (UniqueName: \"kubernetes.io/projected/1c2c8397-6713-4ddc-ab52-af57d514d8c2-kube-api-access-csqnv\") pod \"service-ca-operator-69965bb79d-87xvn\" (UID: \"1c2c8397-6713-4ddc-ab52-af57d514d8c2\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" Apr 16 18:11:41.389638 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.389620 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c4b6825-b8eb-410a-94cd-b638a19c37c4-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6xrcn\" (UID: \"2c4b6825-b8eb-410a-94cd-b638a19c37c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" Apr 16 18:11:41.389738 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.389724 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-stats-auth\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.389842 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.389830 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.389935 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.389924 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c2c8397-6713-4ddc-ab52-af57d514d8c2-config\") pod \"service-ca-operator-69965bb79d-87xvn\" (UID: \"1c2c8397-6713-4ddc-ab52-af57d514d8c2\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" Apr 16 18:11:41.390065 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.390035 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c2c8397-6713-4ddc-ab52-af57d514d8c2-serving-cert\") pod \"service-ca-operator-69965bb79d-87xvn\" (UID: \"1c2c8397-6713-4ddc-ab52-af57d514d8c2\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" Apr 16 18:11:41.390989 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.390959 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m76tn\" (UniqueName: \"kubernetes.io/projected/2c4b6825-b8eb-410a-94cd-b638a19c37c4-kube-api-access-m76tn\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6xrcn\" (UID: \"2c4b6825-b8eb-410a-94cd-b638a19c37c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" Apr 16 18:11:41.391558 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.391539 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c4b6825-b8eb-410a-94cd-b638a19c37c4-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6xrcn\" (UID: \"2c4b6825-b8eb-410a-94cd-b638a19c37c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" Apr 16 18:11:41.391720 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.391707 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-default-certificate\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.391809 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.391798 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.391893 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.391882 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kkbm\" (UniqueName: \"kubernetes.io/projected/6347d10a-8636-4345-ac41-33e915aa23d9-kube-api-access-2kkbm\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.392377 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:41.392359 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:11:41.392538 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:41.392528 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs podName:6347d10a-8636-4345-ac41-33e915aa23d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:41.892507059 +0000 UTC m=+63.276170513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs") pod "router-default-5c668c7d56-vrfwm" (UID: "6347d10a-8636-4345-ac41-33e915aa23d9") : secret "router-metrics-certs-default" not found Apr 16 18:11:41.393329 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:41.393189 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle podName:6347d10a-8636-4345-ac41-33e915aa23d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:41.893172401 +0000 UTC m=+63.276835868 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle") pod "router-default-5c668c7d56-vrfwm" (UID: "6347d10a-8636-4345-ac41-33e915aa23d9") : configmap references non-existent config key: service-ca.crt Apr 16 18:11:41.398630 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.398581 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-stats-auth\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.401980 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.401517 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-default-certificate\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.409646 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.409600 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kkbm\" (UniqueName: \"kubernetes.io/projected/6347d10a-8636-4345-ac41-33e915aa23d9-kube-api-access-2kkbm\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.466076 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.466033 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-6sggh"] Apr 16 18:11:41.468853 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:11:41.468829 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddccaae8_9c6a_4a8c_8840_86e365b52d47.slice/crio-c396b4cca465b7a6519df0071570c06eddf24cf3c500fd25dc07a4b1a21fda18 WatchSource:0}: Error finding container c396b4cca465b7a6519df0071570c06eddf24cf3c500fd25dc07a4b1a21fda18: Status 404 returned error can't find the container with id c396b4cca465b7a6519df0071570c06eddf24cf3c500fd25dc07a4b1a21fda18 Apr 16 18:11:41.492655 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.492630 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c4b6825-b8eb-410a-94cd-b638a19c37c4-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6xrcn\" (UID: \"2c4b6825-b8eb-410a-94cd-b638a19c37c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" Apr 16 18:11:41.492738 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.492701 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csqnv\" (UniqueName: \"kubernetes.io/projected/1c2c8397-6713-4ddc-ab52-af57d514d8c2-kube-api-access-csqnv\") pod \"service-ca-operator-69965bb79d-87xvn\" (UID: \"1c2c8397-6713-4ddc-ab52-af57d514d8c2\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" Apr 16 18:11:41.492738 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.492718 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c4b6825-b8eb-410a-94cd-b638a19c37c4-config\") 
pod \"kube-storage-version-migrator-operator-756bb7d76f-6xrcn\" (UID: \"2c4b6825-b8eb-410a-94cd-b638a19c37c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" Apr 16 18:11:41.492816 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.492746 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c2c8397-6713-4ddc-ab52-af57d514d8c2-config\") pod \"service-ca-operator-69965bb79d-87xvn\" (UID: \"1c2c8397-6713-4ddc-ab52-af57d514d8c2\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" Apr 16 18:11:41.492816 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.492775 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c2c8397-6713-4ddc-ab52-af57d514d8c2-serving-cert\") pod \"service-ca-operator-69965bb79d-87xvn\" (UID: \"1c2c8397-6713-4ddc-ab52-af57d514d8c2\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" Apr 16 18:11:41.492996 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.492975 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m76tn\" (UniqueName: \"kubernetes.io/projected/2c4b6825-b8eb-410a-94cd-b638a19c37c4-kube-api-access-m76tn\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6xrcn\" (UID: \"2c4b6825-b8eb-410a-94cd-b638a19c37c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" Apr 16 18:11:41.493269 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.493251 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c4b6825-b8eb-410a-94cd-b638a19c37c4-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6xrcn\" (UID: \"2c4b6825-b8eb-410a-94cd-b638a19c37c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" Apr 16 18:11:41.494453 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.494432 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c2c8397-6713-4ddc-ab52-af57d514d8c2-config\") pod \"service-ca-operator-69965bb79d-87xvn\" (UID: \"1c2c8397-6713-4ddc-ab52-af57d514d8c2\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" Apr 16 18:11:41.494847 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.494832 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c4b6825-b8eb-410a-94cd-b638a19c37c4-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6xrcn\" (UID: \"2c4b6825-b8eb-410a-94cd-b638a19c37c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" Apr 16 18:11:41.495891 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.495875 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c2c8397-6713-4ddc-ab52-af57d514d8c2-serving-cert\") pod \"service-ca-operator-69965bb79d-87xvn\" (UID: \"1c2c8397-6713-4ddc-ab52-af57d514d8c2\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" Apr 16 18:11:41.500649 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.500631 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-m76tn\" (UniqueName: \"kubernetes.io/projected/2c4b6825-b8eb-410a-94cd-b638a19c37c4-kube-api-access-m76tn\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6xrcn\" (UID: \"2c4b6825-b8eb-410a-94cd-b638a19c37c4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" Apr 16 18:11:41.501781 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.501766 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csqnv\" (UniqueName: \"kubernetes.io/projected/1c2c8397-6713-4ddc-ab52-af57d514d8c2-kube-api-access-csqnv\") pod \"service-ca-operator-69965bb79d-87xvn\" (UID: \"1c2c8397-6713-4ddc-ab52-af57d514d8c2\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" Apr 16 18:11:41.541133 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.541075 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-6sggh" event={"ID":"ddccaae8-9c6a-4a8c-8840-86e365b52d47","Type":"ContainerStarted","Data":"c396b4cca465b7a6519df0071570c06eddf24cf3c500fd25dc07a4b1a21fda18"} Apr 16 18:11:41.543265 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.543243 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" Apr 16 18:11:41.547769 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.547752 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" Apr 16 18:11:41.665232 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.665203 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn"] Apr 16 18:11:41.668080 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:11:41.668052 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c4b6825_b8eb_410a_94cd_b638a19c37c4.slice/crio-b63afdf7b317dad28198eac0cbd99a61cec4dcc3e10e899086678b92b87a0b91 WatchSource:0}: Error finding container b63afdf7b317dad28198eac0cbd99a61cec4dcc3e10e899086678b92b87a0b91: Status 404 returned error can't find the container with id b63afdf7b317dad28198eac0cbd99a61cec4dcc3e10e899086678b92b87a0b91 Apr 16 18:11:41.676513 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.676491 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn"] Apr 16 18:11:41.679584 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:11:41.679561 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c2c8397_6713_4ddc_ab52_af57d514d8c2.slice/crio-2cd791a54677bf3baee2e885b6a2433018709c7f7a44669db1c35450f083af51 WatchSource:0}: Error finding container 2cd791a54677bf3baee2e885b6a2433018709c7f7a44669db1c35450f083af51: Status 404 returned error can't find the container with id 2cd791a54677bf3baee2e885b6a2433018709c7f7a44669db1c35450f083af51 Apr 16 18:11:41.896327 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.896238 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.896462 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:41.896327 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:41.896462 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:41.896420 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle podName:6347d10a-8636-4345-ac41-33e915aa23d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:42.896401588 +0000 UTC m=+64.280065058 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle") pod "router-default-5c668c7d56-vrfwm" (UID: "6347d10a-8636-4345-ac41-33e915aa23d9") : configmap references non-existent config key: service-ca.crt Apr 16 18:11:41.896462 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:41.896434 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:11:41.896563 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:41.896469 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs podName:6347d10a-8636-4345-ac41-33e915aa23d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:42.896458204 +0000 UTC m=+64.280121652 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs") pod "router-default-5c668c7d56-vrfwm" (UID: "6347d10a-8636-4345-ac41-33e915aa23d9") : secret "router-metrics-certs-default" not found Apr 16 18:11:42.545186 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:42.545122 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" event={"ID":"2c4b6825-b8eb-410a-94cd-b638a19c37c4","Type":"ContainerStarted","Data":"b63afdf7b317dad28198eac0cbd99a61cec4dcc3e10e899086678b92b87a0b91"} Apr 16 18:11:42.547080 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:42.547033 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" event={"ID":"1c2c8397-6713-4ddc-ab52-af57d514d8c2","Type":"ContainerStarted","Data":"2cd791a54677bf3baee2e885b6a2433018709c7f7a44669db1c35450f083af51"} Apr 16 18:11:42.903325 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:42.903249 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:42.903456 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:42.903341 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm" Apr 16 18:11:42.903456 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:42.903422 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle podName:6347d10a-8636-4345-ac41-33e915aa23d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:44.903403932 +0000 UTC m=+66.287067611 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle") pod "router-default-5c668c7d56-vrfwm" (UID: "6347d10a-8636-4345-ac41-33e915aa23d9") : configmap references non-existent config key: service-ca.crt Apr 16 18:11:42.903456 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:42.903448 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:11:42.903569 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:42.903487 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs podName:6347d10a-8636-4345-ac41-33e915aa23d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:44.903476533 +0000 UTC m=+66.287139982 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs") pod "router-default-5c668c7d56-vrfwm" (UID: "6347d10a-8636-4345-ac41-33e915aa23d9") : secret "router-metrics-certs-default" not found Apr 16 18:11:43.406838 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:43.406795 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:11:43.407025 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:43.406927 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert\") pod \"ingress-canary-dmjvg\" (UID: \"199e8adf-4b79-4ccd-a556-c3d828499a76\") " pod="openshift-ingress-canary/ingress-canary-dmjvg" Apr 16 18:11:43.407025 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:43.406968 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:11:43.407188 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:43.407029 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:11:43.407188 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:43.407065 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls podName:6b2f5246-4971-4502-a370-da270e3fd3dc nodeName:}" failed. No retries permitted until 2026-04-16 18:12:15.40702359 +0000 UTC m=+96.790687051 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls") pod "dns-default-wcq4f" (UID: "6b2f5246-4971-4502-a370-da270e3fd3dc") : secret "dns-default-metrics-tls" not found Apr 16 18:11:43.407188 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:43.407096 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert podName:199e8adf-4b79-4ccd-a556-c3d828499a76 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:15.407079432 +0000 UTC m=+96.790742886 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert") pod "ingress-canary-dmjvg" (UID: "199e8adf-4b79-4ccd-a556-c3d828499a76") : secret "canary-serving-cert" not found Apr 16 18:11:43.910057 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:43.910010 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:11:43.912866 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:43.912838 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:11:43.920550 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:43.920525 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:11:43.920663 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:43.920604 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs podName:059c23ce-c3ea-4b83-a8a4-3c537435306e nodeName:}" failed. No retries permitted until 2026-04-16 18:12:47.920583251 +0000 UTC m=+129.304246705 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs") pod "network-metrics-daemon-8w622" (UID: "059c23ce-c3ea-4b83-a8a4-3c537435306e") : secret "metrics-daemon-secret" not found Apr 16 18:11:44.111609 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:44.111568 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xjp\" (UniqueName: \"kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp\") pod \"network-check-target-9tn4q\" (UID: \"30b4fba5-7d19-4433-b373-76fe14544828\") " pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:11:44.114418 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:44.114400 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:11:44.126497 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:44.126474 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:11:44.135835 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:44.135804 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8xjp\" (UniqueName: \"kubernetes.io/projected/30b4fba5-7d19-4433-b373-76fe14544828-kube-api-access-v8xjp\") pod \"network-check-target-9tn4q\" (UID: \"30b4fba5-7d19-4433-b373-76fe14544828\") " pod="openshift-network-diagnostics/network-check-target-9tn4q" Apr 16 18:11:44.364370 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:44.364343 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-rbrbw\"" Apr 16 18:11:44.372405 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:44.372390 2567 util.go:30] "No sandbox for pod can be found. 
Apr 16 18:11:44.626326 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:44.626298 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9tn4q"]
Apr 16 18:11:44.630712 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:11:44.630685 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30b4fba5_7d19_4433_b373_76fe14544828.slice/crio-b17872c79f7af38a7781da9811d0046ad97b6aa39d39d4b4f1d9961cb803a043 WatchSource:0}: Error finding container b17872c79f7af38a7781da9811d0046ad97b6aa39d39d4b4f1d9961cb803a043: Status 404 returned error can't find the container with id b17872c79f7af38a7781da9811d0046ad97b6aa39d39d4b4f1d9961cb803a043
Apr 16 18:11:44.916443 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:44.916343 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm"
Apr 16 18:11:44.916885 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:44.916455 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm"
Apr 16 18:11:44.916885 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:44.916499 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle podName:6347d10a-8636-4345-ac41-33e915aa23d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:48.91648122 +0000 UTC m=+70.300144673 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle") pod "router-default-5c668c7d56-vrfwm" (UID: "6347d10a-8636-4345-ac41-33e915aa23d9") : configmap references non-existent config key: service-ca.crt
Apr 16 18:11:44.916885 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:44.916569 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:11:44.916885 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:44.916623 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs podName:6347d10a-8636-4345-ac41-33e915aa23d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:48.916607636 +0000 UTC m=+70.300271091 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs") pod "router-default-5c668c7d56-vrfwm" (UID: "6347d10a-8636-4345-ac41-33e915aa23d9") : secret "router-metrics-certs-default" not found
Apr 16 18:11:45.556300 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:45.556244 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-6sggh" event={"ID":"ddccaae8-9c6a-4a8c-8840-86e365b52d47","Type":"ContainerStarted","Data":"e1d5c23524bebc018f9326b6adf3c3d6a4919910e502609a38dba3be913c9e78"}
Apr 16 18:11:45.557784 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:45.557726 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" event={"ID":"2c4b6825-b8eb-410a-94cd-b638a19c37c4","Type":"ContainerStarted","Data":"53a0d4f4e934bc25cd806bf4d55dbdb61b596dfaed9cfc772325c69835fb39ee"}
Apr 16 18:11:45.558873 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:45.558842 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9tn4q" event={"ID":"30b4fba5-7d19-4433-b373-76fe14544828","Type":"ContainerStarted","Data":"b17872c79f7af38a7781da9811d0046ad97b6aa39d39d4b4f1d9961cb803a043"}
Apr 16 18:11:45.560264 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:45.560241 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" event={"ID":"1c2c8397-6713-4ddc-ab52-af57d514d8c2","Type":"ContainerStarted","Data":"eeb49fdab3b80a2c692b961c59d68ea87fad0a3fc020cb165b36ab8a9da0dab3"}
Apr 16 18:11:45.572127 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:45.572085 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-6sggh" podStartSLOduration=1.583021677 podStartE2EDuration="4.57206949s" podCreationTimestamp="2026-04-16 18:11:41 +0000 UTC" firstStartedPulling="2026-04-16 18:11:41.470656663 +0000 UTC m=+62.854320111" lastFinishedPulling="2026-04-16 18:11:44.459704457 +0000 UTC m=+65.843367924" observedRunningTime="2026-04-16 18:11:45.571166235 +0000 UTC m=+66.954829709" watchObservedRunningTime="2026-04-16 18:11:45.57206949 +0000 UTC m=+66.955732964"
Apr 16 18:11:45.591395 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:45.591338 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" podStartSLOduration=1.7581044810000002 podStartE2EDuration="4.59131806s" podCreationTimestamp="2026-04-16 18:11:41 +0000 UTC" firstStartedPulling="2026-04-16 18:11:41.669888067 +0000 UTC m=+63.053551519" lastFinishedPulling="2026-04-16 18:11:44.503101647 +0000 UTC m=+65.886765098" observedRunningTime="2026-04-16 18:11:45.589054834 +0000 UTC m=+66.972718305" watchObservedRunningTime="2026-04-16 18:11:45.59131806 +0000 UTC m=+66.974981534"
Apr 16 18:11:45.605173 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:45.605115 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" podStartSLOduration=1.787686729 podStartE2EDuration="4.605096238s" podCreationTimestamp="2026-04-16 18:11:41 +0000 UTC" firstStartedPulling="2026-04-16 18:11:41.68125422 +0000 UTC m=+63.064917669" lastFinishedPulling="2026-04-16 18:11:44.498663714 +0000 UTC m=+65.882327178" observedRunningTime="2026-04-16 18:11:45.604009017 +0000 UTC m=+66.987672502" watchObservedRunningTime="2026-04-16 18:11:45.605096238 +0000 UTC m=+66.988759709"
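
Note: the pod_startup_latency_tracker numbers above are internally consistent: podStartSLOduration is the end-to-end startup duration minus the image-pull window. A short check using the monotonic m=+ offsets from the volume-data-source-validator entry, assuming (as the numbers here suggest) that the tracker simply subtracts the pull window:

```go
package main

import "fmt"

// Reproduce podStartSLOduration from the log entry above using its own
// printed values: E2E startup time minus the image-pull window, with the
// pull window taken from the monotonic m=+ offsets.
func main() {
	e2e := 4.57206949         // podStartE2EDuration, seconds
	firstPull := 62.854320111 // firstStartedPulling, m=+ offset
	lastPull := 65.843367924  // lastFinishedPulling, m=+ offset

	slo := e2e - (lastPull - firstPull)
	fmt.Printf("%.9f\n", slo) // 1.583021677, matching podStartSLOduration
}
```

Entries with firstStartedPulling/lastFinishedPulling of "0001-01-01 00:00:00" (no pull needed) accordingly report SLO duration equal to the E2E duration.
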
Apr 16 18:11:47.183751 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.183722 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9pxxv_9a432113-3c33-4a2e-970d-baa3beba7cc7/dns-node-resolver/0.log"
Apr 16 18:11:47.732401 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.732354 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-qb5fx"]
Apr 16 18:11:47.734675 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.734657 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-qb5fx"
Apr 16 18:11:47.737505 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.737472 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 18:11:47.737605 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.737523 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 18:11:47.741710 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.741534 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 18:11:47.741710 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.741538 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-6vvqk\""
Apr 16 18:11:47.741710 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.741584 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 18:11:47.745057 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.745017 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-qb5fx"]
Apr 16 18:11:47.834614 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.834578 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/47127e04-5b43-4f23-8fa6-d5d86053657e-signing-cabundle\") pod \"service-ca-bfc587fb7-qb5fx\" (UID: \"47127e04-5b43-4f23-8fa6-d5d86053657e\") " pod="openshift-service-ca/service-ca-bfc587fb7-qb5fx"
Apr 16 18:11:47.834614 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.834613 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9dgz\" (UniqueName: \"kubernetes.io/projected/47127e04-5b43-4f23-8fa6-d5d86053657e-kube-api-access-n9dgz\") pod \"service-ca-bfc587fb7-qb5fx\" (UID: \"47127e04-5b43-4f23-8fa6-d5d86053657e\") " pod="openshift-service-ca/service-ca-bfc587fb7-qb5fx"
Apr 16 18:11:47.834925 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.834656 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/47127e04-5b43-4f23-8fa6-d5d86053657e-signing-key\") pod \"service-ca-bfc587fb7-qb5fx\" (UID: \"47127e04-5b43-4f23-8fa6-d5d86053657e\") " pod="openshift-service-ca/service-ca-bfc587fb7-qb5fx"
Apr 16 18:11:47.935453 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.935421 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/47127e04-5b43-4f23-8fa6-d5d86053657e-signing-cabundle\") pod \"service-ca-bfc587fb7-qb5fx\" (UID: \"47127e04-5b43-4f23-8fa6-d5d86053657e\") " pod="openshift-service-ca/service-ca-bfc587fb7-qb5fx"
Apr 16 18:11:47.935658 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.935458 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9dgz\" (UniqueName: \"kubernetes.io/projected/47127e04-5b43-4f23-8fa6-d5d86053657e-kube-api-access-n9dgz\") pod \"service-ca-bfc587fb7-qb5fx\" (UID: \"47127e04-5b43-4f23-8fa6-d5d86053657e\") " pod="openshift-service-ca/service-ca-bfc587fb7-qb5fx"
Apr 16 18:11:47.935658 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.935509 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/47127e04-5b43-4f23-8fa6-d5d86053657e-signing-key\") pod \"service-ca-bfc587fb7-qb5fx\" (UID: \"47127e04-5b43-4f23-8fa6-d5d86053657e\") " pod="openshift-service-ca/service-ca-bfc587fb7-qb5fx"
Apr 16 18:11:47.936089 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.936068 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/47127e04-5b43-4f23-8fa6-d5d86053657e-signing-cabundle\") pod \"service-ca-bfc587fb7-qb5fx\" (UID: \"47127e04-5b43-4f23-8fa6-d5d86053657e\") " pod="openshift-service-ca/service-ca-bfc587fb7-qb5fx"
Apr 16 18:11:47.937801 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.937784 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/47127e04-5b43-4f23-8fa6-d5d86053657e-signing-key\") pod \"service-ca-bfc587fb7-qb5fx\" (UID: \"47127e04-5b43-4f23-8fa6-d5d86053657e\") " pod="openshift-service-ca/service-ca-bfc587fb7-qb5fx"
Apr 16 18:11:47.944201 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:47.944176 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9dgz\" (UniqueName: \"kubernetes.io/projected/47127e04-5b43-4f23-8fa6-d5d86053657e-kube-api-access-n9dgz\") pod \"service-ca-bfc587fb7-qb5fx\" (UID: \"47127e04-5b43-4f23-8fa6-d5d86053657e\") " pod="openshift-service-ca/service-ca-bfc587fb7-qb5fx"
Apr 16 18:11:48.047212 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:48.047175 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-qb5fx"
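
Note: the service-ca pod's volumes above move through the volume manager's usual two phases: VerifyControllerAttachedVolume records the volume as attached in the actual state of world, then MountVolume runs SetUp. A purely illustrative sketch of that desired-vs-mounted reconcile pattern (not kubelet code):

```go
package main

import "fmt"

// state holds the two views the reconciler compares on every pass:
// volumes the pod spec requires, and volumes already set up on the node.
type state struct {
	desired map[string]bool
	mounted map[string]bool
}

// reconcile mounts every desired-but-unmounted volume; failures are left
// for a later pass (in the kubelet, with exponential backoff).
func (s *state) reconcile(mount func(string) error) {
	for vol := range s.desired {
		if s.mounted[vol] {
			continue
		}
		// Corresponds to "operationExecutor.MountVolume started ..."
		if err := mount(vol); err != nil {
			fmt.Printf("MountVolume.SetUp failed for %q: %v\n", vol, err)
			continue
		}
		// Corresponds to "MountVolume.SetUp succeeded ..."
		s.mounted[vol] = true
	}
}

func main() {
	s := &state{
		desired: map[string]bool{"signing-cabundle": true, "signing-key": true, "kube-api-access-n9dgz": true},
		mounted: map[string]bool{},
	}
	s.reconcile(func(vol string) error { fmt.Println("mounting", vol); return nil })
}
```
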
Apr 16 18:11:48.164708 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:48.164678 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-qb5fx"]
Apr 16 18:11:48.168917 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:11:48.168887 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47127e04_5b43_4f23_8fa6_d5d86053657e.slice/crio-01df5a7a1689a34abe1e90b3bdebd7212a6502fe1228a05862607cc8994ed90a WatchSource:0}: Error finding container 01df5a7a1689a34abe1e90b3bdebd7212a6502fe1228a05862607cc8994ed90a: Status 404 returned error can't find the container with id 01df5a7a1689a34abe1e90b3bdebd7212a6502fe1228a05862607cc8994ed90a
Apr 16 18:11:48.383275 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:48.383248 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5zgqq_797c1b6c-d66b-4ed4-9555-07a34d9d2f2a/node-ca/0.log"
Apr 16 18:11:48.568988 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:48.568890 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-qb5fx" event={"ID":"47127e04-5b43-4f23-8fa6-d5d86053657e","Type":"ContainerStarted","Data":"6b6660a5ae66199c56a6fa7acb57df7b39c8c359301efdd4dfd7c55509a61390"}
Apr 16 18:11:48.568988 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:48.568932 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-qb5fx" event={"ID":"47127e04-5b43-4f23-8fa6-d5d86053657e","Type":"ContainerStarted","Data":"01df5a7a1689a34abe1e90b3bdebd7212a6502fe1228a05862607cc8994ed90a"}
Apr 16 18:11:48.570093 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:48.570069 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9tn4q" event={"ID":"30b4fba5-7d19-4433-b373-76fe14544828","Type":"ContainerStarted","Data":"534bb1c7e4002f554c799ce6c0cfe25055609bd3d0dddf7f37bc3d2ce67fb2ed"}
Apr 16 18:11:48.570242 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:48.570228 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9tn4q"
Apr 16 18:11:48.589272 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:48.589237 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-qb5fx" podStartSLOduration=1.5892263199999999 podStartE2EDuration="1.58922632s" podCreationTimestamp="2026-04-16 18:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:11:48.588746144 +0000 UTC m=+69.972409615" watchObservedRunningTime="2026-04-16 18:11:48.58922632 +0000 UTC m=+69.972889790"
Apr 16 18:11:48.943921 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:48.943891 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm"
Apr 16 18:11:48.944107 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:48.943961 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm"
Apr 16 18:11:48.944107 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:48.944019 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle podName:6347d10a-8636-4345-ac41-33e915aa23d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:56.944001102 +0000 UTC m=+78.327664551 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle") pod "router-default-5c668c7d56-vrfwm" (UID: "6347d10a-8636-4345-ac41-33e915aa23d9") : configmap references non-existent config key: service-ca.crt
Apr 16 18:11:48.944107 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:48.944059 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:11:48.944107 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:11:48.944101 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs podName:6347d10a-8636-4345-ac41-33e915aa23d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:56.944090174 +0000 UTC m=+78.327753623 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs") pod "router-default-5c668c7d56-vrfwm" (UID: "6347d10a-8636-4345-ac41-33e915aa23d9") : secret "router-metrics-certs-default" not found
Apr 16 18:11:57.011397 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:57.011268 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm"
Apr 16 18:11:57.011397 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:57.011373 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm"
Apr 16 18:11:57.012008 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:57.011986 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6347d10a-8636-4345-ac41-33e915aa23d9-service-ca-bundle\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm"
Apr 16 18:11:57.013600 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:57.013583 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6347d10a-8636-4345-ac41-33e915aa23d9-metrics-certs\") pod \"router-default-5c668c7d56-vrfwm\" (UID: \"6347d10a-8636-4345-ac41-33e915aa23d9\") " pod="openshift-ingress/router-default-5c668c7d56-vrfwm"
Apr 16 18:11:57.042681 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:57.042656 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5c668c7d56-vrfwm"
Apr 16 18:11:57.158954 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:57.158898 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9tn4q" podStartSLOduration=75.171273369 podStartE2EDuration="1m18.158882866s" podCreationTimestamp="2026-04-16 18:10:39 +0000 UTC" firstStartedPulling="2026-04-16 18:11:44.633247249 +0000 UTC m=+66.016910704" lastFinishedPulling="2026-04-16 18:11:47.620856753 +0000 UTC m=+69.004520201" observedRunningTime="2026-04-16 18:11:48.616428672 +0000 UTC m=+70.000092143" watchObservedRunningTime="2026-04-16 18:11:57.158882866 +0000 UTC m=+78.542546336"
Apr 16 18:11:57.159644 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:57.159624 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5c668c7d56-vrfwm"]
Apr 16 18:11:57.162546 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:11:57.162522 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6347d10a_8636_4345_ac41_33e915aa23d9.slice/crio-85fee028ba55255ff7d5169d4221950f68e1b79b360943f8a4c6fdfd6b241ecb WatchSource:0}: Error finding container 85fee028ba55255ff7d5169d4221950f68e1b79b360943f8a4c6fdfd6b241ecb: Status 404 returned error can't find the container with id 85fee028ba55255ff7d5169d4221950f68e1b79b360943f8a4c6fdfd6b241ecb
Apr 16 18:11:57.595951 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:57.595919 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5c668c7d56-vrfwm" event={"ID":"6347d10a-8636-4345-ac41-33e915aa23d9","Type":"ContainerStarted","Data":"0f4c15c7e140b2c8152105dd40127da437d9d0eb7de912daafad2de0b8440d68"}
Apr 16 18:11:57.596156 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:57.595959 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5c668c7d56-vrfwm" event={"ID":"6347d10a-8636-4345-ac41-33e915aa23d9","Type":"ContainerStarted","Data":"85fee028ba55255ff7d5169d4221950f68e1b79b360943f8a4c6fdfd6b241ecb"}
Apr 16 18:11:57.617265 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:57.617215 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5c668c7d56-vrfwm" podStartSLOduration=16.61719938 podStartE2EDuration="16.61719938s" podCreationTimestamp="2026-04-16 18:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:11:57.616988549 +0000 UTC m=+79.000652015" watchObservedRunningTime="2026-04-16 18:11:57.61719938 +0000 UTC m=+79.000862850"
Apr 16 18:11:58.043558 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:58.043525 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5c668c7d56-vrfwm"
Apr 16 18:11:58.046296 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:58.046274 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5c668c7d56-vrfwm"
Apr 16 18:11:58.598422 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:58.598392 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5c668c7d56-vrfwm"
Apr 16 18:11:58.599623 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:11:58.599604 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5c668c7d56-vrfwm"
Apr 16 18:12:06.127986 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.127926 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2z8gk"]
Apr 16 18:12:06.132507 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.132486 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2z8gk"
Apr 16 18:12:06.135299 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.135278 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 18:12:06.135407 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.135360 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7j8vd\""
Apr 16 18:12:06.136472 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.136378 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 18:12:06.136472 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.136445 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 18:12:06.136623 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.136485 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 18:12:06.147465 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.147443 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2z8gk"]
Apr 16 18:12:06.178931 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.178885 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncw42\" (UniqueName: \"kubernetes.io/projected/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-kube-api-access-ncw42\") pod \"insights-runtime-extractor-2z8gk\" (UID: \"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk"
Apr 16 18:12:06.179105 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.178935 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2z8gk\" (UID: \"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk"
Apr 16 18:12:06.179105 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.178993 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-crio-socket\") pod \"insights-runtime-extractor-2z8gk\" (UID: \"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk"
Apr 16 18:12:06.179105 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.179076 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-data-volume\") pod \"insights-runtime-extractor-2z8gk\" (UID: \"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk"
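
Note: the probe transitions at 18:11:58 above (startup unhealthy, then started; readiness not ready, then ready within a second) are normal for a router that has just started serving. They are driven by startup and readiness probes on the pod; a sketch of the kind of spec that produces this sequence, where the paths, port, and thresholds are assumptions for illustration, not the actual router deployment values:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// A startup probe that polls frequently: one early failure logs
	// "unhealthy" before the first success logs "started".
	startup := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{Path: "/healthz", Port: intstr.FromInt32(1936)},
		},
		PeriodSeconds:    1,
		FailureThreshold: 120,
	}
	// Readiness flips the pod to "ready" once the endpoint responds.
	readiness := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{Path: "/healthz/ready", Port: intstr.FromInt32(1936)},
		},
		PeriodSeconds: 10,
	}
	fmt.Println(startup.HTTPGet.Path, readiness.HTTPGet.Path)
}
```
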
\"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk" Apr 16 18:12:06.179105 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.179096 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2z8gk\" (UID: \"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk" Apr 16 18:12:06.219632 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.219605 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-b7ts9"] Apr 16 18:12:06.222826 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.222804 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-6fb2m"] Apr 16 18:12:06.222975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.222958 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-b7ts9" Apr 16 18:12:06.226633 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.226614 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6fb2m" Apr 16 18:12:06.242024 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.242002 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:12:06.242144 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.242002 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:12:06.242392 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.242373 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-4jdhv\"" Apr 16 18:12:06.242673 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.242660 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 18:12:06.242834 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.242822 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 18:12:06.242929 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.242915 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-tngrk\"" Apr 16 18:12:06.262029 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.262006 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-6fb2m"] Apr 16 18:12:06.274000 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.273970 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6f5899868d-w82sl"] Apr 16 18:12:06.278781 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.278749 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-b7ts9"] Apr 16 18:12:06.279310 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.279288 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.279498 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.279472 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/64f89b21-4ff4-4616-8331-df624059595f-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6fb2m\" (UID: \"64f89b21-4ff4-4616-8331-df624059595f\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6fb2m" Apr 16 18:12:06.279687 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.279662 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-data-volume\") pod \"insights-runtime-extractor-2z8gk\" (UID: \"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk" Apr 16 18:12:06.279801 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.279723 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/64f89b21-4ff4-4616-8331-df624059595f-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-6fb2m\" (UID: \"64f89b21-4ff4-4616-8331-df624059595f\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6fb2m" Apr 16 18:12:06.279801 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.279757 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncw42\" (UniqueName: \"kubernetes.io/projected/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-kube-api-access-ncw42\") pod \"insights-runtime-extractor-2z8gk\" (UID: \"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk" Apr 16 18:12:06.279801 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.279780 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gz4x\" (UniqueName: \"kubernetes.io/projected/7a8a2626-153f-4555-84e3-6c7d86f5db58-kube-api-access-2gz4x\") pod \"downloads-586b57c7b4-b7ts9\" (UID: \"7a8a2626-153f-4555-84e3-6c7d86f5db58\") " pod="openshift-console/downloads-586b57c7b4-b7ts9" Apr 16 18:12:06.279966 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.279929 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2z8gk\" (UID: \"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk" Apr 16 18:12:06.280017 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.279980 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-data-volume\") pod \"insights-runtime-extractor-2z8gk\" (UID: \"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk" Apr 16 18:12:06.280128 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.280014 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-crio-socket\") pod \"insights-runtime-extractor-2z8gk\" (UID: 
\"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk" Apr 16 18:12:06.280128 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.280097 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2z8gk\" (UID: \"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk" Apr 16 18:12:06.280319 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.280299 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-crio-socket\") pod \"insights-runtime-extractor-2z8gk\" (UID: \"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk" Apr 16 18:12:06.280657 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.280633 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2z8gk\" (UID: \"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk" Apr 16 18:12:06.282384 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.282365 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2z8gk\" (UID: \"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk" Apr 16 18:12:06.283089 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.283075 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9692l\"" Apr 16 18:12:06.283331 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.283313 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:12:06.283435 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.283413 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:12:06.286317 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.286303 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:12:06.288210 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.288192 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:12:06.297500 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.297478 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncw42\" (UniqueName: \"kubernetes.io/projected/d15436f7-1a2d-4358-bca3-f550e2f4f3ed-kube-api-access-ncw42\") pod \"insights-runtime-extractor-2z8gk\" (UID: \"d15436f7-1a2d-4358-bca3-f550e2f4f3ed\") " pod="openshift-insights/insights-runtime-extractor-2z8gk" Apr 16 18:12:06.301628 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.301600 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f5899868d-w82sl"] Apr 16 
18:12:06.381236 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.381167 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/64f89b21-4ff4-4616-8331-df624059595f-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6fb2m\" (UID: \"64f89b21-4ff4-4616-8331-df624059595f\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6fb2m" Apr 16 18:12:06.381236 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.381203 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-trusted-ca\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.381236 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.381230 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-bound-sa-token\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.381455 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.381262 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-image-registry-private-configuration\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.381455 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.381346 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/64f89b21-4ff4-4616-8331-df624059595f-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-6fb2m\" (UID: \"64f89b21-4ff4-4616-8331-df624059595f\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6fb2m" Apr 16 18:12:06.381455 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.381378 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gz4x\" (UniqueName: \"kubernetes.io/projected/7a8a2626-153f-4555-84e3-6c7d86f5db58-kube-api-access-2gz4x\") pod \"downloads-586b57c7b4-b7ts9\" (UID: \"7a8a2626-153f-4555-84e3-6c7d86f5db58\") " pod="openshift-console/downloads-586b57c7b4-b7ts9" Apr 16 18:12:06.381455 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.381403 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-registry-tls\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.381659 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.381477 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-installation-pull-secrets\") pod \"image-registry-6f5899868d-w82sl\" (UID: 
\"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.381659 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.381520 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-registry-certificates\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.381659 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.381543 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzr9c\" (UniqueName: \"kubernetes.io/projected/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-kube-api-access-fzr9c\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.381659 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.381604 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-ca-trust-extracted\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.381994 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.381971 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/64f89b21-4ff4-4616-8331-df624059595f-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-6fb2m\" (UID: \"64f89b21-4ff4-4616-8331-df624059595f\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6fb2m" Apr 16 18:12:06.383700 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.383678 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/64f89b21-4ff4-4616-8331-df624059595f-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6fb2m\" (UID: \"64f89b21-4ff4-4616-8331-df624059595f\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6fb2m" Apr 16 18:12:06.393873 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.393855 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gz4x\" (UniqueName: \"kubernetes.io/projected/7a8a2626-153f-4555-84e3-6c7d86f5db58-kube-api-access-2gz4x\") pod \"downloads-586b57c7b4-b7ts9\" (UID: \"7a8a2626-153f-4555-84e3-6c7d86f5db58\") " pod="openshift-console/downloads-586b57c7b4-b7ts9" Apr 16 18:12:06.443117 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.443091 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2z8gk" Apr 16 18:12:06.482653 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.482620 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-installation-pull-secrets\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.482814 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.482682 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-registry-certificates\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.482814 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.482711 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzr9c\" (UniqueName: \"kubernetes.io/projected/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-kube-api-access-fzr9c\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.482814 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.482782 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-ca-trust-extracted\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.482955 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.482824 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-trusted-ca\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.482955 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.482844 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-bound-sa-token\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.482955 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.482868 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-image-registry-private-configuration\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.482955 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.482916 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-registry-tls\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " 
pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.483406 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.483349 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-ca-trust-extracted\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.484188 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.484163 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-trusted-ca\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.484293 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.484179 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-registry-certificates\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.485128 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.485106 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-installation-pull-secrets\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.485242 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.485228 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-image-registry-private-configuration\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.485602 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.485586 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-registry-tls\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.493988 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.493922 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzr9c\" (UniqueName: \"kubernetes.io/projected/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-kube-api-access-fzr9c\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.494111 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.494007 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b8d63a6-6b9b-479e-abd2-e27c25f678c7-bound-sa-token\") pod \"image-registry-6f5899868d-w82sl\" (UID: \"6b8d63a6-6b9b-479e-abd2-e27c25f678c7\") " pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.532732 ip-10-0-141-192 
kubenswrapper[2567]: I0416 18:12:06.532403 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-b7ts9" Apr 16 18:12:06.538214 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.538189 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6fb2m" Apr 16 18:12:06.561130 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.561097 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2z8gk"] Apr 16 18:12:06.565259 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:12:06.565226 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd15436f7_1a2d_4358_bca3_f550e2f4f3ed.slice/crio-4ea39469f2a7f1b96efc4249a294c84f6ba903d49fc55deb340f6d3978ae5df2 WatchSource:0}: Error finding container 4ea39469f2a7f1b96efc4249a294c84f6ba903d49fc55deb340f6d3978ae5df2: Status 404 returned error can't find the container with id 4ea39469f2a7f1b96efc4249a294c84f6ba903d49fc55deb340f6d3978ae5df2 Apr 16 18:12:06.593896 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.593874 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:06.618499 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.618454 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2z8gk" event={"ID":"d15436f7-1a2d-4358-bca3-f550e2f4f3ed","Type":"ContainerStarted","Data":"4ea39469f2a7f1b96efc4249a294c84f6ba903d49fc55deb340f6d3978ae5df2"} Apr 16 18:12:06.671276 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.670066 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-b7ts9"] Apr 16 18:12:06.675307 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:12:06.675213 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a8a2626_153f_4555_84e3_6c7d86f5db58.slice/crio-54dd21dcaca56763cc7e273aea1f7068cabe95fe3c5caa09b900ed09e5936e45 WatchSource:0}: Error finding container 54dd21dcaca56763cc7e273aea1f7068cabe95fe3c5caa09b900ed09e5936e45: Status 404 returned error can't find the container with id 54dd21dcaca56763cc7e273aea1f7068cabe95fe3c5caa09b900ed09e5936e45 Apr 16 18:12:06.691448 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.691396 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-6fb2m"] Apr 16 18:12:06.694741 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:12:06.694712 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64f89b21_4ff4_4616_8331_df624059595f.slice/crio-66019459a6efea3eedfbd128a4680b748e9ad9d5fc41288cfe1714bf71174369 WatchSource:0}: Error finding container 66019459a6efea3eedfbd128a4680b748e9ad9d5fc41288cfe1714bf71174369: Status 404 returned error can't find the container with id 66019459a6efea3eedfbd128a4680b748e9ad9d5fc41288cfe1714bf71174369 Apr 16 18:12:06.750509 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:06.750475 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f5899868d-w82sl"] Apr 16 18:12:06.753372 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:12:06.753347 2567 manager.go:1169] Failed 
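
Note: the recurring manager.go:1169 "Failed to process watch event ... can't find the container with id ..." warnings above appear to be a benign race: the cgroup watcher fires before the runtime has registered the new crio-<id> container, and the matching ContainerStarted events follow within about a second. The container ID is the 64-hex-character suffix of the cgroup path; a small illustrative parser (not cAdvisor's implementation):

```go
package main

import (
	"fmt"
	"path"
	"strings"
)

// containerIDFromCgroup extracts the container ID from a cgroup path of
// the form seen in the warnings above: ".../crio-<64-hex-id>".
func containerIDFromCgroup(p string) (string, bool) {
	base := path.Base(p)
	id, ok := strings.CutPrefix(base, "crio-")
	if !ok || len(id) != 64 {
		return "", false
	}
	return id, true
}

func main() {
	p := "/kubepods.slice/kubepods-burstable.slice/" +
		"kubepods-burstable-pod7a8a2626_153f_4555_84e3_6c7d86f5db58.slice/" +
		"crio-54dd21dcaca56763cc7e273aea1f7068cabe95fe3c5caa09b900ed09e5936e45"
	if id, ok := containerIDFromCgroup(p); ok {
		fmt.Println(id) // the ID referenced by the downloads pod's warning
	}
}
```
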
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b8d63a6_6b9b_479e_abd2_e27c25f678c7.slice/crio-0bf34ba29a44911219a2b6df157669b8a12314331bce884c06c249078a08f99e WatchSource:0}: Error finding container 0bf34ba29a44911219a2b6df157669b8a12314331bce884c06c249078a08f99e: Status 404 returned error can't find the container with id 0bf34ba29a44911219a2b6df157669b8a12314331bce884c06c249078a08f99e Apr 16 18:12:07.625374 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:07.625337 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2z8gk" event={"ID":"d15436f7-1a2d-4358-bca3-f550e2f4f3ed","Type":"ContainerStarted","Data":"d0cc3c9e4ac7c9668ca1d9755cbe7b79b9559c123bc5adce5ee62fb80378cb85"} Apr 16 18:12:07.625778 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:07.625386 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2z8gk" event={"ID":"d15436f7-1a2d-4358-bca3-f550e2f4f3ed","Type":"ContainerStarted","Data":"46cc4c82145d4e8acdc2e648ad87e22d17362bdf078e58124e248eb6b3a466e6"} Apr 16 18:12:07.627908 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:07.627030 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f5899868d-w82sl" event={"ID":"6b8d63a6-6b9b-479e-abd2-e27c25f678c7","Type":"ContainerStarted","Data":"40e9f6b9f45e2eabe949edb321ee601543a7e5cc649ccda949c0fc0c3248bcac"} Apr 16 18:12:07.627908 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:07.627089 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f5899868d-w82sl" event={"ID":"6b8d63a6-6b9b-479e-abd2-e27c25f678c7","Type":"ContainerStarted","Data":"0bf34ba29a44911219a2b6df157669b8a12314331bce884c06c249078a08f99e"} Apr 16 18:12:07.627908 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:07.627853 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:07.629139 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:07.629103 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6fb2m" event={"ID":"64f89b21-4ff4-4616-8331-df624059595f","Type":"ContainerStarted","Data":"66019459a6efea3eedfbd128a4680b748e9ad9d5fc41288cfe1714bf71174369"} Apr 16 18:12:07.631159 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:07.630985 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-b7ts9" event={"ID":"7a8a2626-153f-4555-84e3-6c7d86f5db58","Type":"ContainerStarted","Data":"54dd21dcaca56763cc7e273aea1f7068cabe95fe3c5caa09b900ed09e5936e45"} Apr 16 18:12:07.649897 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:07.649832 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6f5899868d-w82sl" podStartSLOduration=1.6498150360000001 podStartE2EDuration="1.649815036s" podCreationTimestamp="2026-04-16 18:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:12:07.648406888 +0000 UTC m=+89.032070360" watchObservedRunningTime="2026-04-16 18:12:07.649815036 +0000 UTC m=+89.033478506" Apr 16 18:12:08.635422 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:08.635380 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6fb2m" event={"ID":"64f89b21-4ff4-4616-8331-df624059595f","Type":"ContainerStarted","Data":"ce9cff0e9c35bdb3bbfdf8c9bc650f6544e2a2e6826ebbe7139c638faa444bd2"} Apr 16 18:12:08.653450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:08.653195 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6fb2m" podStartSLOduration=1.4028714469999999 podStartE2EDuration="2.653145223s" podCreationTimestamp="2026-04-16 18:12:06 +0000 UTC" firstStartedPulling="2026-04-16 18:12:06.696963671 +0000 UTC m=+88.080627133" lastFinishedPulling="2026-04-16 18:12:07.947237457 +0000 UTC m=+89.330900909" observedRunningTime="2026-04-16 18:12:08.651133607 +0000 UTC m=+90.034797083" watchObservedRunningTime="2026-04-16 18:12:08.653145223 +0000 UTC m=+90.036808687" Apr 16 18:12:09.631458 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.631424 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9cf779785-7jwsp"] Apr 16 18:12:09.633982 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.633960 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9cf779785-7jwsp" Apr 16 18:12:09.636832 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.636796 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:12:09.637257 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.637085 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:12:09.637257 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.637139 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:12:09.638003 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.637982 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-fsj6j\"" Apr 16 18:12:09.638300 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.638026 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:12:09.638300 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.637985 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:12:09.640216 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.640170 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2z8gk" event={"ID":"d15436f7-1a2d-4358-bca3-f550e2f4f3ed","Type":"ContainerStarted","Data":"4371100247ed4469d9c2e0ae90d0286df0dbddec55019004a5f82575a69ceffb"} Apr 16 18:12:09.646622 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.645797 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9cf779785-7jwsp"] Apr 16 18:12:09.681003 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.680942 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2z8gk" podStartSLOduration=1.6582181729999999 podStartE2EDuration="3.680925636s" podCreationTimestamp="2026-04-16 18:12:06 +0000 UTC" firstStartedPulling="2026-04-16 18:12:06.64701781 +0000 UTC m=+88.030681262" lastFinishedPulling="2026-04-16 18:12:08.669725271 +0000 UTC 
Apr 16 18:12:09.712124 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.712089 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-service-ca\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.712297 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.712263 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-serving-cert\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.712400 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.712381 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-oauth-config\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.712457 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.712417 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-config\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.712457 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.712443 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92n9k\" (UniqueName: \"kubernetes.io/projected/ae5495c0-ef6e-434f-a763-06bef1f0a704-kube-api-access-92n9k\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.712534 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.712472 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-oauth-serving-cert\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.813462 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.813421 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-oauth-config\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.813643 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.813513 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-config\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.813643 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.813555 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92n9k\" (UniqueName: \"kubernetes.io/projected/ae5495c0-ef6e-434f-a763-06bef1f0a704-kube-api-access-92n9k\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.813643 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.813578 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-oauth-serving-cert\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.813798 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.813644 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-service-ca\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.813938 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.813903 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-serving-cert\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.814436 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.814410 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-service-ca\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.814542 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.814433 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-oauth-serving-cert\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.814542 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.814410 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-config\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.816245 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.816201 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-oauth-config\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.816571 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.816553 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-serving-cert\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.822337 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.822307 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92n9k\" (UniqueName: \"kubernetes.io/projected/ae5495c0-ef6e-434f-a763-06bef1f0a704-kube-api-access-92n9k\") pod \"console-9cf779785-7jwsp\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") " pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:09.945962 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:09.945923 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:10.078397 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:10.078366 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9cf779785-7jwsp"]
Apr 16 18:12:10.081771 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:12:10.081743 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae5495c0_ef6e_434f_a763_06bef1f0a704.slice/crio-e69f6654984f9989ecab6524f9a900bb95181d97074bdb30156d93e05a6bb21b WatchSource:0}: Error finding container e69f6654984f9989ecab6524f9a900bb95181d97074bdb30156d93e05a6bb21b: Status 404 returned error can't find the container with id e69f6654984f9989ecab6524f9a900bb95181d97074bdb30156d93e05a6bb21b
Apr 16 18:12:10.645254 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:10.645197 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9cf779785-7jwsp" event={"ID":"ae5495c0-ef6e-434f-a763-06bef1f0a704","Type":"ContainerStarted","Data":"e69f6654984f9989ecab6524f9a900bb95181d97074bdb30156d93e05a6bb21b"}
Apr 16 18:12:13.371668 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.371585 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-s8k28"]
Apr 16 18:12:13.377293 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.377267 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28"
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" Apr 16 18:12:13.380254 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.380172 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 18:12:13.380254 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.380185 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:12:13.380419 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.380264 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-vpk4j\"" Apr 16 18:12:13.381558 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.381313 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:12:13.381558 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.381330 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:12:13.381558 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.381328 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:12:13.384978 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.384937 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-s8k28"] Apr 16 18:12:13.403751 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.403717 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"] Apr 16 18:12:13.415881 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.415828 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hh7nj"] Apr 16 18:12:13.416070 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.415987 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd" Apr 16 18:12:13.419118 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.419075 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:12:13.419257 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.419173 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 18:12:13.419257 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.419185 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 18:12:13.419442 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.419300 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-8gfh7\"" Apr 16 18:12:13.430231 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.430206 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"] Apr 16 18:12:13.430365 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.430351 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hh7nj" Apr 16 18:12:13.433274 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.433253 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:12:13.433984 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.433964 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:12:13.434100 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.433964 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:12:13.434310 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.434290 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gwlxm\"" Apr 16 18:12:13.446357 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.446334 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8582l\" (UniqueName: \"kubernetes.io/projected/15bb2e52-9699-4655-905c-281a6c94f097-kube-api-access-8582l\") pod \"openshift-state-metrics-5669946b84-s8k28\" (UID: \"15bb2e52-9699-4655-905c-281a6c94f097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" Apr 16 18:12:13.446488 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.446376 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15bb2e52-9699-4655-905c-281a6c94f097-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-s8k28\" (UID: \"15bb2e52-9699-4655-905c-281a6c94f097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" Apr 16 18:12:13.446488 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.446438 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15bb2e52-9699-4655-905c-281a6c94f097-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-s8k28\" (UID: \"15bb2e52-9699-4655-905c-281a6c94f097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" Apr 16 18:12:13.446488 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.446471 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/15bb2e52-9699-4655-905c-281a6c94f097-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-s8k28\" (UID: \"15bb2e52-9699-4655-905c-281a6c94f097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" Apr 16 18:12:13.547171 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547135 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15bb2e52-9699-4655-905c-281a6c94f097-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-s8k28\" (UID: \"15bb2e52-9699-4655-905c-281a6c94f097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" Apr 16 18:12:13.547356 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547194 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a485b132-000d-45ba-814d-0a8ae8aa60b1-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd" Apr 16 18:12:13.547356 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547229 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/15bb2e52-9699-4655-905c-281a6c94f097-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-s8k28\" (UID: \"15bb2e52-9699-4655-905c-281a6c94f097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" Apr 16 18:12:13.547356 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547257 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a485b132-000d-45ba-814d-0a8ae8aa60b1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd" Apr 16 18:12:13.547356 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547331 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-textfile\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj" Apr 16 18:12:13.547556 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:12:13.547359 2567 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 18:12:13.547556 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547371 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6fb9\" (UniqueName: \"kubernetes.io/projected/a485b132-000d-45ba-814d-0a8ae8aa60b1-kube-api-access-l6fb9\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd" Apr 16 18:12:13.547556 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547409 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c788c166-fe22-4cdc-919e-d0d5a8ac872f-sys\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj" Apr 16 18:12:13.547556 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:12:13.547439 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15bb2e52-9699-4655-905c-281a6c94f097-openshift-state-metrics-tls podName:15bb2e52-9699-4655-905c-281a6c94f097 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:14.047414965 +0000 UTC m=+95.431078437 (durationBeforeRetry 500ms). 
Apr 16 18:12:13.547556 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547510 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.547556 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547548 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-tls\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.547784 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547573 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-accelerators-collector-config\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.547784 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547608 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a485b132-000d-45ba-814d-0a8ae8aa60b1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
Apr 16 18:12:13.547784 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547639 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a485b132-000d-45ba-814d-0a8ae8aa60b1-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
Apr 16 18:12:13.547784 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547679 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c788c166-fe22-4cdc-919e-d0d5a8ac872f-root\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.547784 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547706 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-wtmp\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.547784 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547734 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c788c166-fe22-4cdc-919e-d0d5a8ac872f-metrics-client-ca\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.547784 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547778 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8582l\" (UniqueName: \"kubernetes.io/projected/15bb2e52-9699-4655-905c-281a6c94f097-kube-api-access-8582l\") pod \"openshift-state-metrics-5669946b84-s8k28\" (UID: \"15bb2e52-9699-4655-905c-281a6c94f097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28"
Apr 16 18:12:13.548062 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547812 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15bb2e52-9699-4655-905c-281a6c94f097-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-s8k28\" (UID: \"15bb2e52-9699-4655-905c-281a6c94f097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28"
Apr 16 18:12:13.548062 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547843 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a485b132-000d-45ba-814d-0a8ae8aa60b1-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
Apr 16 18:12:13.548062 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547872 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15bb2e52-9699-4655-905c-281a6c94f097-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-s8k28\" (UID: \"15bb2e52-9699-4655-905c-281a6c94f097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28"
Apr 16 18:12:13.548062 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.547889 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km86c\" (UniqueName: \"kubernetes.io/projected/c788c166-fe22-4cdc-919e-d0d5a8ac872f-kube-api-access-km86c\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.550477 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.550449 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15bb2e52-9699-4655-905c-281a6c94f097-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-s8k28\" (UID: \"15bb2e52-9699-4655-905c-281a6c94f097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28"
Apr 16 18:12:13.557000 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.556978 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8582l\" (UniqueName: \"kubernetes.io/projected/15bb2e52-9699-4655-905c-281a6c94f097-kube-api-access-8582l\") pod \"openshift-state-metrics-5669946b84-s8k28\" (UID: \"15bb2e52-9699-4655-905c-281a6c94f097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28"
Apr 16 18:12:13.648823 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.648738 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-km86c\" (UniqueName: \"kubernetes.io/projected/c788c166-fe22-4cdc-919e-d0d5a8ac872f-kube-api-access-km86c\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.648823 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.648796 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a485b132-000d-45ba-814d-0a8ae8aa60b1-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
Apr 16 18:12:13.649035 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.648842 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a485b132-000d-45ba-814d-0a8ae8aa60b1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
Apr 16 18:12:13.649035 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.648870 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-textfile\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.649035 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.648892 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6fb9\" (UniqueName: \"kubernetes.io/projected/a485b132-000d-45ba-814d-0a8ae8aa60b1-kube-api-access-l6fb9\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
Apr 16 18:12:13.649035 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.648918 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c788c166-fe22-4cdc-919e-d0d5a8ac872f-sys\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.649035 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.648974 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.649035 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.649010 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-tls\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.649035 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.649036 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-accelerators-collector-config\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.649389 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.649090 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a485b132-000d-45ba-814d-0a8ae8aa60b1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
Apr 16 18:12:13.649389 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.649123 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a485b132-000d-45ba-814d-0a8ae8aa60b1-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
Apr 16 18:12:13.649389 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.649154 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c788c166-fe22-4cdc-919e-d0d5a8ac872f-root\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.649389 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.649179 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-wtmp\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.649389 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.649210 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c788c166-fe22-4cdc-919e-d0d5a8ac872f-metrics-client-ca\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.649389 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.649265 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a485b132-000d-45ba-814d-0a8ae8aa60b1-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
Apr 16 18:12:13.650197 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.649752 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c788c166-fe22-4cdc-919e-d0d5a8ac872f-root\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.650197 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.649943 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-wtmp\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.650197 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.650067 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a485b132-000d-45ba-814d-0a8ae8aa60b1-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
Apr 16 18:12:13.650197 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.650083 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a485b132-000d-45ba-814d-0a8ae8aa60b1-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
Apr 16 18:12:13.650197 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.650139 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c788c166-fe22-4cdc-919e-d0d5a8ac872f-sys\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.650495 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.650376 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a485b132-000d-45ba-814d-0a8ae8aa60b1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
Apr 16 18:12:13.650495 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:12:13.650486 2567 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 18:12:13.650592 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.650545 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c788c166-fe22-4cdc-919e-d0d5a8ac872f-metrics-client-ca\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.650649 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:12:13.650620 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-tls podName:c788c166-fe22-4cdc-919e-d0d5a8ac872f nodeName:}" failed. No retries permitted until 2026-04-16 18:12:14.150536964 +0000 UTC m=+95.534200426 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-tls") pod "node-exporter-hh7nj" (UID: "c788c166-fe22-4cdc-919e-d0d5a8ac872f") : secret "node-exporter-tls" not found
Apr 16 18:12:13.651076 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.650990 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-textfile\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.651633 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.651438 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-accelerators-collector-config\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.653253 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.652895 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.653253 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.653078 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a485b132-000d-45ba-814d-0a8ae8aa60b1-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
Apr 16 18:12:13.653595 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.653573 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a485b132-000d-45ba-814d-0a8ae8aa60b1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
Apr 16 18:12:13.657182 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.657121 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9cf779785-7jwsp" event={"ID":"ae5495c0-ef6e-434f-a763-06bef1f0a704","Type":"ContainerStarted","Data":"a079af9134a1ccd8b9c4fda043a303c5097beaabf320c237ac6aa595c2bbcf1f"}
Apr 16 18:12:13.658169 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.658151 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-km86c\" (UniqueName: \"kubernetes.io/projected/c788c166-fe22-4cdc-919e-d0d5a8ac872f-kube-api-access-km86c\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj"
Apr 16 18:12:13.658274 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.658166 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6fb9\" (UniqueName: \"kubernetes.io/projected/a485b132-000d-45ba-814d-0a8ae8aa60b1-kube-api-access-l6fb9\") pod \"kube-state-metrics-7479c89684-q2xwd\" (UID: \"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"
\"a485b132-000d-45ba-814d-0a8ae8aa60b1\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd" Apr 16 18:12:13.674595 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.674548 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9cf779785-7jwsp" podStartSLOduration=1.6980281929999999 podStartE2EDuration="4.674536402s" podCreationTimestamp="2026-04-16 18:12:09 +0000 UTC" firstStartedPulling="2026-04-16 18:12:10.084662877 +0000 UTC m=+91.468326326" lastFinishedPulling="2026-04-16 18:12:13.061171073 +0000 UTC m=+94.444834535" observedRunningTime="2026-04-16 18:12:13.674106382 +0000 UTC m=+95.057769860" watchObservedRunningTime="2026-04-16 18:12:13.674536402 +0000 UTC m=+95.058199873" Apr 16 18:12:13.730443 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.730413 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd" Apr 16 18:12:13.868461 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:13.868406 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-q2xwd"] Apr 16 18:12:13.871515 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:12:13.871471 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda485b132_000d_45ba_814d_0a8ae8aa60b1.slice/crio-f7693498024e10919414bae5096ac36fdcba9274780a225d22fd92f696711016 WatchSource:0}: Error finding container f7693498024e10919414bae5096ac36fdcba9274780a225d22fd92f696711016: Status 404 returned error can't find the container with id f7693498024e10919414bae5096ac36fdcba9274780a225d22fd92f696711016 Apr 16 18:12:14.054386 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:14.054348 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/15bb2e52-9699-4655-905c-281a6c94f097-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-s8k28\" (UID: \"15bb2e52-9699-4655-905c-281a6c94f097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" Apr 16 18:12:14.056815 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:14.056789 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/15bb2e52-9699-4655-905c-281a6c94f097-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-s8k28\" (UID: \"15bb2e52-9699-4655-905c-281a6c94f097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" Apr 16 18:12:14.155210 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:14.155160 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-tls\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj" Apr 16 18:12:14.157790 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:14.157765 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c788c166-fe22-4cdc-919e-d0d5a8ac872f-node-exporter-tls\") pod \"node-exporter-hh7nj\" (UID: \"c788c166-fe22-4cdc-919e-d0d5a8ac872f\") " pod="openshift-monitoring/node-exporter-hh7nj" Apr 16 18:12:14.289770 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:14.289735 2567 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" Apr 16 18:12:14.342694 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:14.342540 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hh7nj" Apr 16 18:12:14.441628 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:14.441472 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-s8k28"] Apr 16 18:12:14.444694 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:12:14.444635 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15bb2e52_9699_4655_905c_281a6c94f097.slice/crio-ffcde93900e68dc26b5de50cc28dd002ac2dd08f3b8d3adb2a01e8f07db544cf WatchSource:0}: Error finding container ffcde93900e68dc26b5de50cc28dd002ac2dd08f3b8d3adb2a01e8f07db544cf: Status 404 returned error can't find the container with id ffcde93900e68dc26b5de50cc28dd002ac2dd08f3b8d3adb2a01e8f07db544cf Apr 16 18:12:14.662108 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:14.661997 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hh7nj" event={"ID":"c788c166-fe22-4cdc-919e-d0d5a8ac872f","Type":"ContainerStarted","Data":"791cec03c6909b874066c20b7e9f04537374830d7c13b95376930d9926bfd281"} Apr 16 18:12:14.664253 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:14.664226 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" event={"ID":"15bb2e52-9699-4655-905c-281a6c94f097","Type":"ContainerStarted","Data":"9f7d7fb6af5c1ae396850476eae74d24626a74a6500cdb501576cb4380523bb2"} Apr 16 18:12:14.664357 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:14.664262 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" event={"ID":"15bb2e52-9699-4655-905c-281a6c94f097","Type":"ContainerStarted","Data":"ffcde93900e68dc26b5de50cc28dd002ac2dd08f3b8d3adb2a01e8f07db544cf"} Apr 16 18:12:14.665694 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:14.665633 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd" event={"ID":"a485b132-000d-45ba-814d-0a8ae8aa60b1","Type":"ContainerStarted","Data":"f7693498024e10919414bae5096ac36fdcba9274780a225d22fd92f696711016"} Apr 16 18:12:15.470183 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:15.470151 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:12:15.470583 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:15.470306 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert\") pod \"ingress-canary-dmjvg\" (UID: \"199e8adf-4b79-4ccd-a556-c3d828499a76\") " pod="openshift-ingress-canary/ingress-canary-dmjvg" Apr 16 18:12:15.473367 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:15.473341 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/199e8adf-4b79-4ccd-a556-c3d828499a76-cert\") pod \"ingress-canary-dmjvg\" (UID: 
\"199e8adf-4b79-4ccd-a556-c3d828499a76\") " pod="openshift-ingress-canary/ingress-canary-dmjvg" Apr 16 18:12:15.473704 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:15.473658 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b2f5246-4971-4502-a370-da270e3fd3dc-metrics-tls\") pod \"dns-default-wcq4f\" (UID: \"6b2f5246-4971-4502-a370-da270e3fd3dc\") " pod="openshift-dns/dns-default-wcq4f" Apr 16 18:12:15.671674 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:15.671640 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hh7nj" event={"ID":"c788c166-fe22-4cdc-919e-d0d5a8ac872f","Type":"ContainerStarted","Data":"05848cb9be8f826b7ef10fefe0d61f11e84048badf7aeae6b9c70829366a129a"} Apr 16 18:12:15.674439 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:15.674397 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" event={"ID":"15bb2e52-9699-4655-905c-281a6c94f097","Type":"ContainerStarted","Data":"10cbeefa8a3bed8aee2836d2957240fff47b91374ed2d130155202841995a794"} Apr 16 18:12:15.677395 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:15.677347 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd" event={"ID":"a485b132-000d-45ba-814d-0a8ae8aa60b1","Type":"ContainerStarted","Data":"c543af036489dc989e77d32838af283d90cd79195b51d467c9e0c2d98ca3523a"} Apr 16 18:12:15.727137 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:15.726912 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-n88vn\"" Apr 16 18:12:15.737740 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:15.736061 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wcq4f" Apr 16 18:12:15.759709 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:15.759650 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kd8rq\"" Apr 16 18:12:15.768710 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:15.768323 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dmjvg" Apr 16 18:12:15.945108 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:15.945029 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wcq4f"] Apr 16 18:12:15.950082 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:12:15.950027 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b2f5246_4971_4502_a370_da270e3fd3dc.slice/crio-f6d9058bcb3a27127a164c4c8a83940279224cf51ffe8ce3a293f2d06fc1133f WatchSource:0}: Error finding container f6d9058bcb3a27127a164c4c8a83940279224cf51ffe8ce3a293f2d06fc1133f: Status 404 returned error can't find the container with id f6d9058bcb3a27127a164c4c8a83940279224cf51ffe8ce3a293f2d06fc1133f Apr 16 18:12:15.970368 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:15.970310 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dmjvg"] Apr 16 18:12:15.973761 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:12:15.973695 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199e8adf_4b79_4ccd_a556_c3d828499a76.slice/crio-f5c613bf804e5d62673b63fa7844565c3b83fc5bb3fc5bee0ebce3cdeb6c19fb WatchSource:0}: Error finding container f5c613bf804e5d62673b63fa7844565c3b83fc5bb3fc5bee0ebce3cdeb6c19fb: Status 404 returned error can't find the container with id f5c613bf804e5d62673b63fa7844565c3b83fc5bb3fc5bee0ebce3cdeb6c19fb Apr 16 18:12:16.689772 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:16.689736 2567 generic.go:358] "Generic (PLEG): container finished" podID="c788c166-fe22-4cdc-919e-d0d5a8ac872f" containerID="05848cb9be8f826b7ef10fefe0d61f11e84048badf7aeae6b9c70829366a129a" exitCode=0 Apr 16 18:12:16.690484 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:16.690457 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hh7nj" event={"ID":"c788c166-fe22-4cdc-919e-d0d5a8ac872f","Type":"ContainerDied","Data":"05848cb9be8f826b7ef10fefe0d61f11e84048badf7aeae6b9c70829366a129a"} Apr 16 18:12:16.692712 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:16.692682 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" event={"ID":"15bb2e52-9699-4655-905c-281a6c94f097","Type":"ContainerStarted","Data":"ffd1811ff047d86d64a235ab1ecc47feba691db38dbcefd12dfeef05c078c3c4"} Apr 16 18:12:16.695086 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:16.695062 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dmjvg" event={"ID":"199e8adf-4b79-4ccd-a556-c3d828499a76","Type":"ContainerStarted","Data":"f5c613bf804e5d62673b63fa7844565c3b83fc5bb3fc5bee0ebce3cdeb6c19fb"} Apr 16 18:12:16.698886 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:16.697928 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd" event={"ID":"a485b132-000d-45ba-814d-0a8ae8aa60b1","Type":"ContainerStarted","Data":"df1bb6158a7e3da1b61fdb0c1b71b889203b5731356479fad9766289853c8146"} Apr 16 18:12:16.698886 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:16.697967 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd" 
event={"ID":"a485b132-000d-45ba-814d-0a8ae8aa60b1","Type":"ContainerStarted","Data":"ab72bb019c72cddbc5cf388be6771249d61b144d7b4bc5af0d77efa64372256a"} Apr 16 18:12:16.700322 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:16.699955 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wcq4f" event={"ID":"6b2f5246-4971-4502-a370-da270e3fd3dc","Type":"ContainerStarted","Data":"f6d9058bcb3a27127a164c4c8a83940279224cf51ffe8ce3a293f2d06fc1133f"} Apr 16 18:12:16.734480 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:16.734407 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-s8k28" podStartSLOduration=2.105709374 podStartE2EDuration="3.734389653s" podCreationTimestamp="2026-04-16 18:12:13 +0000 UTC" firstStartedPulling="2026-04-16 18:12:14.723292717 +0000 UTC m=+96.106956168" lastFinishedPulling="2026-04-16 18:12:16.351972983 +0000 UTC m=+97.735636447" observedRunningTime="2026-04-16 18:12:16.733847674 +0000 UTC m=+98.117511147" watchObservedRunningTime="2026-04-16 18:12:16.734389653 +0000 UTC m=+98.118053127" Apr 16 18:12:16.756103 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:16.755979 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-q2xwd" podStartSLOduration=2.085626822 podStartE2EDuration="3.755959768s" podCreationTimestamp="2026-04-16 18:12:13 +0000 UTC" firstStartedPulling="2026-04-16 18:12:13.87374902 +0000 UTC m=+95.257412469" lastFinishedPulling="2026-04-16 18:12:15.544081962 +0000 UTC m=+96.927745415" observedRunningTime="2026-04-16 18:12:16.754508928 +0000 UTC m=+98.138172400" watchObservedRunningTime="2026-04-16 18:12:16.755959768 +0000 UTC m=+98.139623241" Apr 16 18:12:17.800690 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.800660 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7c8979ff68-w29nx"] Apr 16 18:12:17.803510 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.803484 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.806321 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.806297 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 18:12:17.807589 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.807569 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 18:12:17.807708 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.807604 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-9gfxw\"" Apr 16 18:12:17.807708 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.807619 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-4dep9or3gi231\"" Apr 16 18:12:17.807818 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.807704 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 18:12:17.807867 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.807842 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:12:17.815184 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.815161 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c8979ff68-w29nx"] Apr 16 18:12:17.892266 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.892231 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3cd4878e-7041-4bf3-97b2-ff7307a91899-metrics-server-audit-profiles\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.892439 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.892289 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3cd4878e-7041-4bf3-97b2-ff7307a91899-secret-metrics-server-tls\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.892439 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.892374 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3cd4878e-7041-4bf3-97b2-ff7307a91899-secret-metrics-server-client-certs\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.892545 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.892472 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3cd4878e-7041-4bf3-97b2-ff7307a91899-audit-log\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.892545 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.892505 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cd4878e-7041-4bf3-97b2-ff7307a91899-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.892545 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.892534 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd4878e-7041-4bf3-97b2-ff7307a91899-client-ca-bundle\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.892705 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.892560 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ncfw\" (UniqueName: \"kubernetes.io/projected/3cd4878e-7041-4bf3-97b2-ff7307a91899-kube-api-access-7ncfw\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.993685 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.993646 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3cd4878e-7041-4bf3-97b2-ff7307a91899-audit-log\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.993879 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.993695 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cd4878e-7041-4bf3-97b2-ff7307a91899-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.993879 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.993730 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd4878e-7041-4bf3-97b2-ff7307a91899-client-ca-bundle\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.993879 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.993748 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ncfw\" (UniqueName: \"kubernetes.io/projected/3cd4878e-7041-4bf3-97b2-ff7307a91899-kube-api-access-7ncfw\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.993879 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.993780 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3cd4878e-7041-4bf3-97b2-ff7307a91899-metrics-server-audit-profiles\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.993879 ip-10-0-141-192 kubenswrapper[2567]: I0416 
18:12:17.993822 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3cd4878e-7041-4bf3-97b2-ff7307a91899-secret-metrics-server-tls\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.993879 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.993872 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3cd4878e-7041-4bf3-97b2-ff7307a91899-secret-metrics-server-client-certs\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.994467 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.994411 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3cd4878e-7041-4bf3-97b2-ff7307a91899-audit-log\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.994714 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.994678 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cd4878e-7041-4bf3-97b2-ff7307a91899-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.995028 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.994988 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3cd4878e-7041-4bf3-97b2-ff7307a91899-metrics-server-audit-profiles\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.996736 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.996719 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3cd4878e-7041-4bf3-97b2-ff7307a91899-secret-metrics-server-tls\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.996834 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.996804 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3cd4878e-7041-4bf3-97b2-ff7307a91899-secret-metrics-server-client-certs\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:17.996918 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:17.996886 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd4878e-7041-4bf3-97b2-ff7307a91899-client-ca-bundle\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:18.002316 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.002297 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ncfw\" (UniqueName: \"kubernetes.io/projected/3cd4878e-7041-4bf3-97b2-ff7307a91899-kube-api-access-7ncfw\") pod \"metrics-server-7c8979ff68-w29nx\" (UID: \"3cd4878e-7041-4bf3-97b2-ff7307a91899\") " pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:18.115488 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.115347 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:18.233248 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.233214 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74754b9cb7-zw5kc"] Apr 16 18:12:18.237005 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.236981 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74754b9cb7-zw5kc" Apr 16 18:12:18.249402 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.248726 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 18:12:18.249402 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.249012 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74754b9cb7-zw5kc"] Apr 16 18:12:18.296718 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.296684 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m988h\" (UniqueName: \"kubernetes.io/projected/0b69ee87-2290-4a15-ae19-7a06468fb617-kube-api-access-m988h\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc" Apr 16 18:12:18.296872 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.296725 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-oauth-serving-cert\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc" Apr 16 18:12:18.296872 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.296839 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b69ee87-2290-4a15-ae19-7a06468fb617-console-serving-cert\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc" Apr 16 18:12:18.296980 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.296898 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-console-config\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc" Apr 16 18:12:18.296980 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.296945 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-service-ca\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc" Apr 16 18:12:18.297118 ip-10-0-141-192 
Apr 16 18:12:18.297118 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.297036 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-trusted-ca-bundle\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.398028 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.397941 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b69ee87-2290-4a15-ae19-7a06468fb617-console-serving-cert\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.398028 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.398000 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-console-config\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.398284 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.398058 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-service-ca\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.398284 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.398105 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b69ee87-2290-4a15-ae19-7a06468fb617-console-oauth-config\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.398284 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.398137 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-trusted-ca-bundle\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.398284 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.398173 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m988h\" (UniqueName: \"kubernetes.io/projected/0b69ee87-2290-4a15-ae19-7a06468fb617-kube-api-access-m988h\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.398485 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.398471 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-oauth-serving-cert\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.398967 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.398937 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-console-config\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.398967 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.398949 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-service-ca\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.399189 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.399090 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-trusted-ca-bundle\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.399189 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.399153 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-oauth-serving-cert\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.401063 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.401011 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b69ee87-2290-4a15-ae19-7a06468fb617-console-serving-cert\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.401466 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.401421 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b69ee87-2290-4a15-ae19-7a06468fb617-console-oauth-config\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.406880 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.406837 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m988h\" (UniqueName: \"kubernetes.io/projected/0b69ee87-2290-4a15-ae19-7a06468fb617-kube-api-access-m988h\") pod \"console-74754b9cb7-zw5kc\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:18.553647 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:18.553603 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74754b9cb7-zw5kc"
Apr 16 18:12:19.575705 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:19.575671 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9tn4q"
Apr 16 18:12:19.947033 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:19.947001 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:19.947227 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:19.947080 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:12:19.948731 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:19.948704 2567 patch_prober.go:28] interesting pod/console-9cf779785-7jwsp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.17:8443/health\": dial tcp 10.133.0.17:8443: connect: connection refused" start-of-body=
Apr 16 18:12:19.948851 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:19.948764 2567 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-9cf779785-7jwsp" podUID="ae5495c0-ef6e-434f-a763-06bef1f0a704" containerName="console" probeResult="failure" output="Get \"https://10.133.0.17:8443/health\": dial tcp 10.133.0.17:8443: connect: connection refused"
Apr 16 18:12:24.606608 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.605334 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74754b9cb7-zw5kc"]
Apr 16 18:12:24.616272 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:12:24.616244 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b69ee87_2290_4a15_ae19_7a06468fb617.slice/crio-d1effbd909771fa5ed491744326c4588f38569d5822b37497a6687fd02e6957c WatchSource:0}: Error finding container d1effbd909771fa5ed491744326c4588f38569d5822b37497a6687fd02e6957c: Status 404 returned error can't find the container with id d1effbd909771fa5ed491744326c4588f38569d5822b37497a6687fd02e6957c
Apr 16 18:12:24.634201 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.633957 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c8979ff68-w29nx"]
Apr 16 18:12:24.639655 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:12:24.639630 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cd4878e_7041_4bf3_97b2_ff7307a91899.slice/crio-eb123a3df08fefdbb53342baff7492b9ee3336c8385768b2757db57eb0ea6328 WatchSource:0}: Error finding container eb123a3df08fefdbb53342baff7492b9ee3336c8385768b2757db57eb0ea6328: Status 404 returned error can't find the container with id eb123a3df08fefdbb53342baff7492b9ee3336c8385768b2757db57eb0ea6328
Apr 16 18:12:24.738445 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.738374 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dmjvg" event={"ID":"199e8adf-4b79-4ccd-a556-c3d828499a76","Type":"ContainerStarted","Data":"7fff08a81249f6049afac089681f1a18c3c42401d18396034177dc99a838a3dc"}
Apr 16 18:12:24.740951 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.740879 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wcq4f" event={"ID":"6b2f5246-4971-4502-a370-da270e3fd3dc","Type":"ContainerStarted","Data":"74bb02e6da35493fb4249830b055735e5d6b3c22ac0be69ead1fd1bf087b73e5"}
event={"ID":"6b2f5246-4971-4502-a370-da270e3fd3dc","Type":"ContainerStarted","Data":"74bb02e6da35493fb4249830b055735e5d6b3c22ac0be69ead1fd1bf087b73e5"} Apr 16 18:12:24.743399 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.743361 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hh7nj" event={"ID":"c788c166-fe22-4cdc-919e-d0d5a8ac872f","Type":"ContainerStarted","Data":"aa4391ef9d0b7e62917a6b7dfc6ad11ca6a31111b0777d7a10cff7de1e680efd"} Apr 16 18:12:24.743399 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.743394 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hh7nj" event={"ID":"c788c166-fe22-4cdc-919e-d0d5a8ac872f","Type":"ContainerStarted","Data":"2d7915ce68fb13c0110ee8268433f2575d6d1224d4ceefbf044cd843b65f6d04"} Apr 16 18:12:24.746440 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.746378 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" event={"ID":"3cd4878e-7041-4bf3-97b2-ff7307a91899","Type":"ContainerStarted","Data":"eb123a3df08fefdbb53342baff7492b9ee3336c8385768b2757db57eb0ea6328"} Apr 16 18:12:24.755561 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.753790 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-b7ts9" event={"ID":"7a8a2626-153f-4555-84e3-6c7d86f5db58","Type":"ContainerStarted","Data":"3858558dc9a9841a45dfc707ee6f1f1f17f1c63e2959fe35cd1399c35349214f"} Apr 16 18:12:24.755770 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.755721 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-b7ts9" Apr 16 18:12:24.756418 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.756396 2567 patch_prober.go:28] interesting pod/downloads-586b57c7b4-b7ts9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.133.0.14:8080/\": dial tcp 10.133.0.14:8080: connect: connection refused" start-of-body= Apr 16 18:12:24.756577 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.756556 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-586b57c7b4-b7ts9" podUID="7a8a2626-153f-4555-84e3-6c7d86f5db58" containerName="download-server" probeResult="failure" output="Get \"http://10.133.0.14:8080/\": dial tcp 10.133.0.14:8080: connect: connection refused" Apr 16 18:12:24.760327 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.760276 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74754b9cb7-zw5kc" event={"ID":"0b69ee87-2290-4a15-ae19-7a06468fb617","Type":"ContainerStarted","Data":"3a7c73dbcd567313b96cad78a97b82cb298d3cba516c9b99520ab48bb901b981"} Apr 16 18:12:24.760327 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.760304 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74754b9cb7-zw5kc" event={"ID":"0b69ee87-2290-4a15-ae19-7a06468fb617","Type":"ContainerStarted","Data":"d1effbd909771fa5ed491744326c4588f38569d5822b37497a6687fd02e6957c"} Apr 16 18:12:24.763493 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.762986 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dmjvg" podStartSLOduration=65.297710533 podStartE2EDuration="1m13.762971381s" podCreationTimestamp="2026-04-16 18:11:11 +0000 UTC" firstStartedPulling="2026-04-16 18:12:15.977288015 +0000 UTC m=+97.360951467" 
lastFinishedPulling="2026-04-16 18:12:24.442548852 +0000 UTC m=+105.826212315" observedRunningTime="2026-04-16 18:12:24.761532914 +0000 UTC m=+106.145196385" watchObservedRunningTime="2026-04-16 18:12:24.762971381 +0000 UTC m=+106.146634853" Apr 16 18:12:24.785560 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.785492 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-b7ts9" podStartSLOduration=0.991724096 podStartE2EDuration="18.785471838s" podCreationTimestamp="2026-04-16 18:12:06 +0000 UTC" firstStartedPulling="2026-04-16 18:12:06.678412374 +0000 UTC m=+88.062075827" lastFinishedPulling="2026-04-16 18:12:24.472160105 +0000 UTC m=+105.855823569" observedRunningTime="2026-04-16 18:12:24.783489099 +0000 UTC m=+106.167152572" watchObservedRunningTime="2026-04-16 18:12:24.785471838 +0000 UTC m=+106.169135312" Apr 16 18:12:24.808427 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.806544 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hh7nj" podStartSLOduration=10.629314303 podStartE2EDuration="11.806524546s" podCreationTimestamp="2026-04-16 18:12:13 +0000 UTC" firstStartedPulling="2026-04-16 18:12:14.368453415 +0000 UTC m=+95.752116891" lastFinishedPulling="2026-04-16 18:12:15.545663686 +0000 UTC m=+96.929327134" observedRunningTime="2026-04-16 18:12:24.806080508 +0000 UTC m=+106.189743980" watchObservedRunningTime="2026-04-16 18:12:24.806524546 +0000 UTC m=+106.190188018" Apr 16 18:12:24.824658 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:24.824565 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74754b9cb7-zw5kc" podStartSLOduration=6.824546599 podStartE2EDuration="6.824546599s" podCreationTimestamp="2026-04-16 18:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:12:24.823933544 +0000 UTC m=+106.207597016" watchObservedRunningTime="2026-04-16 18:12:24.824546599 +0000 UTC m=+106.208210071" Apr 16 18:12:25.766996 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:25.766938 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wcq4f" event={"ID":"6b2f5246-4971-4502-a370-da270e3fd3dc","Type":"ContainerStarted","Data":"d75153f48e6e22aaee7159c973edc27932690702c6e33b20cfcd161a314207e6"} Apr 16 18:12:25.767684 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:25.767660 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wcq4f" Apr 16 18:12:25.778248 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:25.778224 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-b7ts9" Apr 16 18:12:25.787265 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:25.786947 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wcq4f" podStartSLOduration=66.292833211 podStartE2EDuration="1m14.786931484s" podCreationTimestamp="2026-04-16 18:11:11 +0000 UTC" firstStartedPulling="2026-04-16 18:12:15.952540667 +0000 UTC m=+97.336204116" lastFinishedPulling="2026-04-16 18:12:24.446638929 +0000 UTC m=+105.830302389" observedRunningTime="2026-04-16 18:12:25.784704737 +0000 UTC m=+107.168368209" watchObservedRunningTime="2026-04-16 18:12:25.786931484 +0000 UTC m=+107.170594958" Apr 16 18:12:26.598829 ip-10-0-141-192 kubenswrapper[2567]: 
I0416 18:12:26.598748 2567 patch_prober.go:28] interesting pod/image-registry-6f5899868d-w82sl container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:12:26.598829 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:26.598813 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6f5899868d-w82sl" podUID="6b8d63a6-6b9b-479e-abd2-e27c25f678c7" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:12:26.771916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:26.771860 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" event={"ID":"3cd4878e-7041-4bf3-97b2-ff7307a91899","Type":"ContainerStarted","Data":"afd08fd396b008403169abc607f30beca0815626c7b4d1720bf0ffa5e4303dd0"} Apr 16 18:12:26.791882 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:26.791823 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" podStartSLOduration=8.187448349 podStartE2EDuration="9.791806688s" podCreationTimestamp="2026-04-16 18:12:17 +0000 UTC" firstStartedPulling="2026-04-16 18:12:24.642009949 +0000 UTC m=+106.025673401" lastFinishedPulling="2026-04-16 18:12:26.246368274 +0000 UTC m=+107.630031740" observedRunningTime="2026-04-16 18:12:26.79037841 +0000 UTC m=+108.174041882" watchObservedRunningTime="2026-04-16 18:12:26.791806688 +0000 UTC m=+108.175470161" Apr 16 18:12:28.554170 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:28.554133 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74754b9cb7-zw5kc" Apr 16 18:12:28.554626 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:28.554180 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74754b9cb7-zw5kc" Apr 16 18:12:28.555651 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:28.555624 2567 patch_prober.go:28] interesting pod/console-74754b9cb7-zw5kc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.21:8443/health\": dial tcp 10.133.0.21:8443: connect: connection refused" start-of-body= Apr 16 18:12:28.555779 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:28.555665 2567 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-74754b9cb7-zw5kc" podUID="0b69ee87-2290-4a15-ae19-7a06468fb617" containerName="console" probeResult="failure" output="Get \"https://10.133.0.21:8443/health\": dial tcp 10.133.0.21:8443: connect: connection refused" Apr 16 18:12:29.645181 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:29.645142 2567 patch_prober.go:28] interesting pod/image-registry-6f5899868d-w82sl container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:12:29.645755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:29.645204 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6f5899868d-w82sl" podUID="6b8d63a6-6b9b-479e-abd2-e27c25f678c7" containerName="registry" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:12:29.947137 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:29.947087 2567 patch_prober.go:28] interesting pod/console-9cf779785-7jwsp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.17:8443/health\": dial tcp 10.133.0.17:8443: connect: connection refused" start-of-body= Apr 16 18:12:29.947397 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:29.947156 2567 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-9cf779785-7jwsp" podUID="ae5495c0-ef6e-434f-a763-06bef1f0a704" containerName="console" probeResult="failure" output="Get \"https://10.133.0.17:8443/health\": dial tcp 10.133.0.17:8443: connect: connection refused" Apr 16 18:12:36.598403 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:36.598369 2567 patch_prober.go:28] interesting pod/image-registry-6f5899868d-w82sl container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:12:36.598834 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:36.598420 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6f5899868d-w82sl" podUID="6b8d63a6-6b9b-479e-abd2-e27c25f678c7" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:12:36.777356 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:36.777325 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wcq4f" Apr 16 18:12:38.115998 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:38.115960 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:38.115998 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:38.116005 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:38.559792 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:38.559765 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74754b9cb7-zw5kc" Apr 16 18:12:38.563891 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:38.563870 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74754b9cb7-zw5kc" Apr 16 18:12:38.621500 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:38.621466 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9cf779785-7jwsp"] Apr 16 18:12:39.644980 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:39.644952 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6f5899868d-w82sl" Apr 16 18:12:47.969388 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:47.969353 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:12:47.971617 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:47.971597 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/059c23ce-c3ea-4b83-a8a4-3c537435306e-metrics-certs\") pod \"network-metrics-daemon-8w622\" (UID: \"059c23ce-c3ea-4b83-a8a4-3c537435306e\") " pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:12:48.258954 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:48.258877 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xdm8g\"" Apr 16 18:12:48.267082 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:48.267062 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8w622" Apr 16 18:12:48.405913 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:48.405834 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8w622"] Apr 16 18:12:48.408795 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:12:48.408764 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod059c23ce_c3ea_4b83_a8a4_3c537435306e.slice/crio-7101a69d471e8c7e41ac79c962bf79cce678b4d320378162b9086ecf2433caf5 WatchSource:0}: Error finding container 7101a69d471e8c7e41ac79c962bf79cce678b4d320378162b9086ecf2433caf5: Status 404 returned error can't find the container with id 7101a69d471e8c7e41ac79c962bf79cce678b4d320378162b9086ecf2433caf5 Apr 16 18:12:48.841567 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:48.841517 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8w622" event={"ID":"059c23ce-c3ea-4b83-a8a4-3c537435306e","Type":"ContainerStarted","Data":"7101a69d471e8c7e41ac79c962bf79cce678b4d320378162b9086ecf2433caf5"} Apr 16 18:12:49.846274 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:49.846243 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8w622" event={"ID":"059c23ce-c3ea-4b83-a8a4-3c537435306e","Type":"ContainerStarted","Data":"e0756fe850ec5c8e1ddd78666599dbc870ab80a08c345247638769656fd9104a"} Apr 16 18:12:49.846274 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:49.846277 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8w622" event={"ID":"059c23ce-c3ea-4b83-a8a4-3c537435306e","Type":"ContainerStarted","Data":"7152650d8bd342bb4e27f7e7965b48bf3fcb9e6f684abe1be1ac8f6d056422d6"} Apr 16 18:12:49.863925 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:49.863877 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8w622" podStartSLOduration=129.906499528 podStartE2EDuration="2m10.863861265s" podCreationTimestamp="2026-04-16 18:10:39 +0000 UTC" firstStartedPulling="2026-04-16 18:12:48.410777998 +0000 UTC m=+129.794441461" lastFinishedPulling="2026-04-16 18:12:49.368139731 +0000 UTC m=+130.751803198" observedRunningTime="2026-04-16 18:12:49.862937313 +0000 UTC m=+131.246600784" watchObservedRunningTime="2026-04-16 18:12:49.863861265 +0000 UTC m=+131.247524736" Apr 16 18:12:56.524897 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:56.524865 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5c668c7d56-vrfwm_6347d10a-8636-4345-ac41-33e915aa23d9/router/0.log" Apr 16 18:12:56.536436 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:56.536412 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-dmjvg_199e8adf-4b79-4ccd-a556-c3d828499a76/serve-healthcheck-canary/0.log" Apr 16 18:12:58.121844 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:58.121817 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:12:58.125765 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:12:58.125743 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7c8979ff68-w29nx" Apr 16 18:13:00.880923 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:00.880892 2567 generic.go:358] "Generic (PLEG): container finished" podID="2c4b6825-b8eb-410a-94cd-b638a19c37c4" containerID="53a0d4f4e934bc25cd806bf4d55dbdb61b596dfaed9cfc772325c69835fb39ee" exitCode=0 Apr 16 18:13:00.881310 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:00.880965 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" event={"ID":"2c4b6825-b8eb-410a-94cd-b638a19c37c4","Type":"ContainerDied","Data":"53a0d4f4e934bc25cd806bf4d55dbdb61b596dfaed9cfc772325c69835fb39ee"} Apr 16 18:13:00.881310 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:00.881269 2567 scope.go:117] "RemoveContainer" containerID="53a0d4f4e934bc25cd806bf4d55dbdb61b596dfaed9cfc772325c69835fb39ee" Apr 16 18:13:01.884947 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:01.884913 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6xrcn" event={"ID":"2c4b6825-b8eb-410a-94cd-b638a19c37c4","Type":"ContainerStarted","Data":"ed23144aeeee51b8f40c66da3e67683a0d84c3efc5711ff1dacc41035ccbe264"} Apr 16 18:13:03.641162 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:03.641105 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9cf779785-7jwsp" podUID="ae5495c0-ef6e-434f-a763-06bef1f0a704" containerName="console" containerID="cri-o://a079af9134a1ccd8b9c4fda043a303c5097beaabf320c237ac6aa595c2bbcf1f" gracePeriod=15 Apr 16 18:13:03.891763 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:03.891693 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9cf779785-7jwsp_ae5495c0-ef6e-434f-a763-06bef1f0a704/console/0.log" Apr 16 18:13:03.891763 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:03.891739 2567 generic.go:358] "Generic (PLEG): container finished" podID="ae5495c0-ef6e-434f-a763-06bef1f0a704" containerID="a079af9134a1ccd8b9c4fda043a303c5097beaabf320c237ac6aa595c2bbcf1f" exitCode=2 Apr 16 18:13:03.891944 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:03.891793 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9cf779785-7jwsp" event={"ID":"ae5495c0-ef6e-434f-a763-06bef1f0a704","Type":"ContainerDied","Data":"a079af9134a1ccd8b9c4fda043a303c5097beaabf320c237ac6aa595c2bbcf1f"} Apr 16 18:13:03.911033 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:03.911012 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9cf779785-7jwsp_ae5495c0-ef6e-434f-a763-06bef1f0a704/console/0.log" Apr 16 18:13:03.911168 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:03.911096 2567 util.go:48] "No ready sandbox for pod can be found. 
Apr 16 18:13:04.005509 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.005471 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-config\") pod \"ae5495c0-ef6e-434f-a763-06bef1f0a704\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") "
Apr 16 18:13:04.005509 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.005519 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92n9k\" (UniqueName: \"kubernetes.io/projected/ae5495c0-ef6e-434f-a763-06bef1f0a704-kube-api-access-92n9k\") pod \"ae5495c0-ef6e-434f-a763-06bef1f0a704\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") "
Apr 16 18:13:04.005744 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.005544 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-service-ca\") pod \"ae5495c0-ef6e-434f-a763-06bef1f0a704\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") "
Apr 16 18:13:04.005744 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.005588 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-oauth-config\") pod \"ae5495c0-ef6e-434f-a763-06bef1f0a704\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") "
Apr 16 18:13:04.005744 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.005625 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-oauth-serving-cert\") pod \"ae5495c0-ef6e-434f-a763-06bef1f0a704\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") "
Apr 16 18:13:04.005744 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.005659 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-serving-cert\") pod \"ae5495c0-ef6e-434f-a763-06bef1f0a704\" (UID: \"ae5495c0-ef6e-434f-a763-06bef1f0a704\") "
Apr 16 18:13:04.005964 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.005936 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-config" (OuterVolumeSpecName: "console-config") pod "ae5495c0-ef6e-434f-a763-06bef1f0a704" (UID: "ae5495c0-ef6e-434f-a763-06bef1f0a704"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:04.006062 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.006001 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-service-ca" (OuterVolumeSpecName: "service-ca") pod "ae5495c0-ef6e-434f-a763-06bef1f0a704" (UID: "ae5495c0-ef6e-434f-a763-06bef1f0a704"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:04.006062 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.006036 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ae5495c0-ef6e-434f-a763-06bef1f0a704" (UID: "ae5495c0-ef6e-434f-a763-06bef1f0a704"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:04.007963 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.007935 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ae5495c0-ef6e-434f-a763-06bef1f0a704" (UID: "ae5495c0-ef6e-434f-a763-06bef1f0a704"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:04.008074 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.007995 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ae5495c0-ef6e-434f-a763-06bef1f0a704" (UID: "ae5495c0-ef6e-434f-a763-06bef1f0a704"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:04.008074 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.008017 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae5495c0-ef6e-434f-a763-06bef1f0a704-kube-api-access-92n9k" (OuterVolumeSpecName: "kube-api-access-92n9k") pod "ae5495c0-ef6e-434f-a763-06bef1f0a704" (UID: "ae5495c0-ef6e-434f-a763-06bef1f0a704"). InnerVolumeSpecName "kube-api-access-92n9k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:13:04.106482 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.106445 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-service-ca\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:13:04.106482 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.106474 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-oauth-config\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:13:04.106482 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.106486 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-oauth-serving-cert\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:13:04.106703 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.106500 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-serving-cert\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:13:04.106703 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.106514 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae5495c0-ef6e-434f-a763-06bef1f0a704-console-config\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:13:04.106703 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.106526 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-92n9k\" (UniqueName: \"kubernetes.io/projected/ae5495c0-ef6e-434f-a763-06bef1f0a704-kube-api-access-92n9k\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:13:04.898466 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.898434 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9cf779785-7jwsp_ae5495c0-ef6e-434f-a763-06bef1f0a704/console/0.log"
Apr 16 18:13:04.898877 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.898553 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9cf779785-7jwsp"
Apr 16 18:13:04.898877 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.898564 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9cf779785-7jwsp" event={"ID":"ae5495c0-ef6e-434f-a763-06bef1f0a704","Type":"ContainerDied","Data":"e69f6654984f9989ecab6524f9a900bb95181d97074bdb30156d93e05a6bb21b"}
Apr 16 18:13:04.898877 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.898622 2567 scope.go:117] "RemoveContainer" containerID="a079af9134a1ccd8b9c4fda043a303c5097beaabf320c237ac6aa595c2bbcf1f"
Apr 16 18:13:04.920656 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.920630 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9cf779785-7jwsp"]
Apr 16 18:13:04.924297 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:04.924273 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9cf779785-7jwsp"]
Apr 16 18:13:05.241650 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:05.241622 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae5495c0-ef6e-434f-a763-06bef1f0a704" path="/var/lib/kubelet/pods/ae5495c0-ef6e-434f-a763-06bef1f0a704/volumes"
Apr 16 18:13:05.903180 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:05.903146 2567 generic.go:358] "Generic (PLEG): container finished" podID="1c2c8397-6713-4ddc-ab52-af57d514d8c2" containerID="eeb49fdab3b80a2c692b961c59d68ea87fad0a3fc020cb165b36ab8a9da0dab3" exitCode=0
Apr 16 18:13:05.903553 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:05.903219 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" event={"ID":"1c2c8397-6713-4ddc-ab52-af57d514d8c2","Type":"ContainerDied","Data":"eeb49fdab3b80a2c692b961c59d68ea87fad0a3fc020cb165b36ab8a9da0dab3"}
Apr 16 18:13:05.903553 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:05.903535 2567 scope.go:117] "RemoveContainer" containerID="eeb49fdab3b80a2c692b961c59d68ea87fad0a3fc020cb165b36ab8a9da0dab3"
Apr 16 18:13:06.907558 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:06.907521 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-87xvn" event={"ID":"1c2c8397-6713-4ddc-ab52-af57d514d8c2","Type":"ContainerStarted","Data":"fc098f4d3d8e199a0572f76b358590bb04b91249b39fb892aa7eb320d02ba7f7"}
Apr 16 18:13:41.846519 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:41.846480 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c5db7f49f-9gjsl"]
Apr 16 18:13:41.846941 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:41.846922 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae5495c0-ef6e-434f-a763-06bef1f0a704" containerName="console"
Apr 16 18:13:41.846988 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:41.846952 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5495c0-ef6e-434f-a763-06bef1f0a704" containerName="console"
Apr 16 18:13:41.847071 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:41.847010 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae5495c0-ef6e-434f-a763-06bef1f0a704" containerName="console"
Apr 16 18:13:41.850492 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:41.850477 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:41.862430 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:41.862407 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c5db7f49f-9gjsl"]
Apr 16 18:13:41.904685 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:41.904647 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw2sk\" (UniqueName: \"kubernetes.io/projected/398217c0-98ec-4b26-9fdf-481198b4ff89-kube-api-access-dw2sk\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:41.904847 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:41.904690 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/398217c0-98ec-4b26-9fdf-481198b4ff89-console-serving-cert\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:41.904847 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:41.904740 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-console-config\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:41.904847 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:41.904801 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-service-ca\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:41.904847 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:41.904821 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/398217c0-98ec-4b26-9fdf-481198b4ff89-console-oauth-config\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:41.904847 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:41.904841 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-oauth-serving-cert\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:41.905086 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:41.904929 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-trusted-ca-bundle\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.006083 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.006026 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-service-ca\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.006083 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.006078 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/398217c0-98ec-4b26-9fdf-481198b4ff89-console-oauth-config\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.006290 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.006110 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-oauth-serving-cert\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.006290 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.006152 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-trusted-ca-bundle\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.006290 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.006175 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw2sk\" (UniqueName: \"kubernetes.io/projected/398217c0-98ec-4b26-9fdf-481198b4ff89-kube-api-access-dw2sk\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.006290 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.006196 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/398217c0-98ec-4b26-9fdf-481198b4ff89-console-serving-cert\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.006290 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.006229 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-console-config\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.007005 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.006979 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-oauth-serving-cert\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.007130 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.006982 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-console-config\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.007130 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.007022 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-service-ca\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.007130 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.007082 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-trusted-ca-bundle\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.009007 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.008982 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/398217c0-98ec-4b26-9fdf-481198b4ff89-console-oauth-config\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.009233 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.009212 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/398217c0-98ec-4b26-9fdf-481198b4ff89-console-serving-cert\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.014166 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.014149 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw2sk\" (UniqueName: \"kubernetes.io/projected/398217c0-98ec-4b26-9fdf-481198b4ff89-kube-api-access-dw2sk\") pod \"console-c5db7f49f-9gjsl\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " pod="openshift-console/console-c5db7f49f-9gjsl"
Apr 16 18:13:42.159856 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.159773 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-c5db7f49f-9gjsl" Apr 16 18:13:42.280069 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:42.280012 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c5db7f49f-9gjsl"] Apr 16 18:13:42.283404 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:13:42.283376 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod398217c0_98ec_4b26_9fdf_481198b4ff89.slice/crio-76e6ad1f301783b255c18dacf914deecf6a4ef3b12f2488aca58ea3c871aa03f WatchSource:0}: Error finding container 76e6ad1f301783b255c18dacf914deecf6a4ef3b12f2488aca58ea3c871aa03f: Status 404 returned error can't find the container with id 76e6ad1f301783b255c18dacf914deecf6a4ef3b12f2488aca58ea3c871aa03f Apr 16 18:13:43.006805 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:43.006768 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5db7f49f-9gjsl" event={"ID":"398217c0-98ec-4b26-9fdf-481198b4ff89","Type":"ContainerStarted","Data":"e5ff408e199b55eb6c81c278f66c48e6b8049569d905bc288243fe998da86eb0"} Apr 16 18:13:43.006805 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:43.006806 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5db7f49f-9gjsl" event={"ID":"398217c0-98ec-4b26-9fdf-481198b4ff89","Type":"ContainerStarted","Data":"76e6ad1f301783b255c18dacf914deecf6a4ef3b12f2488aca58ea3c871aa03f"} Apr 16 18:13:43.025251 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:43.025207 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c5db7f49f-9gjsl" podStartSLOduration=2.025191807 podStartE2EDuration="2.025191807s" podCreationTimestamp="2026-04-16 18:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:13:43.023186843 +0000 UTC m=+184.406850314" watchObservedRunningTime="2026-04-16 18:13:43.025191807 +0000 UTC m=+184.408855277" Apr 16 18:13:52.160007 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:52.159972 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c5db7f49f-9gjsl" Apr 16 18:13:52.160537 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:52.160018 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c5db7f49f-9gjsl" Apr 16 18:13:52.164729 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:52.164707 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c5db7f49f-9gjsl" Apr 16 18:13:53.037861 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:53.037835 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c5db7f49f-9gjsl" Apr 16 18:13:53.083548 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:13:53.083520 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74754b9cb7-zw5kc"] Apr 16 18:14:18.104770 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.104712 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74754b9cb7-zw5kc" podUID="0b69ee87-2290-4a15-ae19-7a06468fb617" containerName="console" containerID="cri-o://3a7c73dbcd567313b96cad78a97b82cb298d3cba516c9b99520ab48bb901b981" gracePeriod=15 Apr 16 18:14:18.343548 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.343522 2567 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74754b9cb7-zw5kc_0b69ee87-2290-4a15-ae19-7a06468fb617/console/0.log" Apr 16 18:14:18.343660 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.343583 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74754b9cb7-zw5kc" Apr 16 18:14:18.513111 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.513062 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b69ee87-2290-4a15-ae19-7a06468fb617-console-oauth-config\") pod \"0b69ee87-2290-4a15-ae19-7a06468fb617\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " Apr 16 18:14:18.513279 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.513143 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-service-ca\") pod \"0b69ee87-2290-4a15-ae19-7a06468fb617\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " Apr 16 18:14:18.513279 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.513171 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-trusted-ca-bundle\") pod \"0b69ee87-2290-4a15-ae19-7a06468fb617\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " Apr 16 18:14:18.513279 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.513199 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-oauth-serving-cert\") pod \"0b69ee87-2290-4a15-ae19-7a06468fb617\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " Apr 16 18:14:18.513425 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.513346 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m988h\" (UniqueName: \"kubernetes.io/projected/0b69ee87-2290-4a15-ae19-7a06468fb617-kube-api-access-m988h\") pod \"0b69ee87-2290-4a15-ae19-7a06468fb617\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " Apr 16 18:14:18.513425 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.513396 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b69ee87-2290-4a15-ae19-7a06468fb617-console-serving-cert\") pod \"0b69ee87-2290-4a15-ae19-7a06468fb617\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " Apr 16 18:14:18.513525 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.513422 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-console-config\") pod \"0b69ee87-2290-4a15-ae19-7a06468fb617\" (UID: \"0b69ee87-2290-4a15-ae19-7a06468fb617\") " Apr 16 18:14:18.513620 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.513592 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0b69ee87-2290-4a15-ae19-7a06468fb617" (UID: "0b69ee87-2290-4a15-ae19-7a06468fb617"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:18.513620 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.513605 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b69ee87-2290-4a15-ae19-7a06468fb617" (UID: "0b69ee87-2290-4a15-ae19-7a06468fb617"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:18.513748 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.513619 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0b69ee87-2290-4a15-ae19-7a06468fb617" (UID: "0b69ee87-2290-4a15-ae19-7a06468fb617"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:18.513748 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.513714 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-service-ca\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:14:18.513748 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.513734 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-trusted-ca-bundle\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:14:18.513869 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.513749 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-oauth-serving-cert\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:14:18.513911 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.513893 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-console-config" (OuterVolumeSpecName: "console-config") pod "0b69ee87-2290-4a15-ae19-7a06468fb617" (UID: "0b69ee87-2290-4a15-ae19-7a06468fb617"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:18.515313 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.515288 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b69ee87-2290-4a15-ae19-7a06468fb617-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0b69ee87-2290-4a15-ae19-7a06468fb617" (UID: "0b69ee87-2290-4a15-ae19-7a06468fb617"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:14:18.515454 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.515429 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b69ee87-2290-4a15-ae19-7a06468fb617-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0b69ee87-2290-4a15-ae19-7a06468fb617" (UID: "0b69ee87-2290-4a15-ae19-7a06468fb617"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:14:18.515544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.515489 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b69ee87-2290-4a15-ae19-7a06468fb617-kube-api-access-m988h" (OuterVolumeSpecName: "kube-api-access-m988h") pod "0b69ee87-2290-4a15-ae19-7a06468fb617" (UID: "0b69ee87-2290-4a15-ae19-7a06468fb617"). InnerVolumeSpecName "kube-api-access-m988h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:14:18.614909 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.614881 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m988h\" (UniqueName: \"kubernetes.io/projected/0b69ee87-2290-4a15-ae19-7a06468fb617-kube-api-access-m988h\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:14:18.614909 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.614905 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b69ee87-2290-4a15-ae19-7a06468fb617-console-serving-cert\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:14:18.615086 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.614917 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b69ee87-2290-4a15-ae19-7a06468fb617-console-config\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:14:18.615086 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:18.614926 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b69ee87-2290-4a15-ae19-7a06468fb617-console-oauth-config\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:14:19.108747 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:19.108720 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74754b9cb7-zw5kc_0b69ee87-2290-4a15-ae19-7a06468fb617/console/0.log" Apr 16 18:14:19.109150 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:19.108759 2567 generic.go:358] "Generic (PLEG): container finished" podID="0b69ee87-2290-4a15-ae19-7a06468fb617" containerID="3a7c73dbcd567313b96cad78a97b82cb298d3cba516c9b99520ab48bb901b981" exitCode=2 Apr 16 18:14:19.109150 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:19.108789 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74754b9cb7-zw5kc" event={"ID":"0b69ee87-2290-4a15-ae19-7a06468fb617","Type":"ContainerDied","Data":"3a7c73dbcd567313b96cad78a97b82cb298d3cba516c9b99520ab48bb901b981"} Apr 16 18:14:19.109150 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:19.108820 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74754b9cb7-zw5kc" Apr 16 18:14:19.109150 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:19.108826 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74754b9cb7-zw5kc" event={"ID":"0b69ee87-2290-4a15-ae19-7a06468fb617","Type":"ContainerDied","Data":"d1effbd909771fa5ed491744326c4588f38569d5822b37497a6687fd02e6957c"} Apr 16 18:14:19.109150 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:19.108841 2567 scope.go:117] "RemoveContainer" containerID="3a7c73dbcd567313b96cad78a97b82cb298d3cba516c9b99520ab48bb901b981" Apr 16 18:14:19.116768 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:19.116656 2567 scope.go:117] "RemoveContainer" containerID="3a7c73dbcd567313b96cad78a97b82cb298d3cba516c9b99520ab48bb901b981" Apr 16 18:14:19.117082 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:14:19.117037 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a7c73dbcd567313b96cad78a97b82cb298d3cba516c9b99520ab48bb901b981\": container with ID starting with 3a7c73dbcd567313b96cad78a97b82cb298d3cba516c9b99520ab48bb901b981 not found: ID does not exist" containerID="3a7c73dbcd567313b96cad78a97b82cb298d3cba516c9b99520ab48bb901b981" Apr 16 18:14:19.117178 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:19.117087 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7c73dbcd567313b96cad78a97b82cb298d3cba516c9b99520ab48bb901b981"} err="failed to get container status \"3a7c73dbcd567313b96cad78a97b82cb298d3cba516c9b99520ab48bb901b981\": rpc error: code = NotFound desc = could not find container \"3a7c73dbcd567313b96cad78a97b82cb298d3cba516c9b99520ab48bb901b981\": container with ID starting with 3a7c73dbcd567313b96cad78a97b82cb298d3cba516c9b99520ab48bb901b981 not found: ID does not exist" Apr 16 18:14:19.129829 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:19.129804 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74754b9cb7-zw5kc"] Apr 16 18:14:19.133223 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:19.133203 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74754b9cb7-zw5kc"] Apr 16 18:14:19.241898 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:19.241866 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b69ee87-2290-4a15-ae19-7a06468fb617" path="/var/lib/kubelet/pods/0b69ee87-2290-4a15-ae19-7a06468fb617/volumes" Apr 16 18:14:54.501082 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.501006 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-54c68c4599-p82pw"] Apr 16 18:14:54.501530 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.501320 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b69ee87-2290-4a15-ae19-7a06468fb617" containerName="console" Apr 16 18:14:54.501530 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.501332 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b69ee87-2290-4a15-ae19-7a06468fb617" containerName="console" Apr 16 18:14:54.501530 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.501397 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b69ee87-2290-4a15-ae19-7a06468fb617" containerName="console" Apr 16 18:14:54.505264 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.505241 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.516655 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.516631 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54c68c4599-p82pw"] Apr 16 18:14:54.600431 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.600377 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-service-ca\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.600613 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.600449 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-oauth-serving-cert\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.600613 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.600468 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe49d306-342c-46cb-a161-7299711e30fc-console-serving-cert\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.600613 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.600499 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe49d306-342c-46cb-a161-7299711e30fc-console-oauth-config\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.600613 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.600523 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-console-config\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.600613 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.600538 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-trusted-ca-bundle\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.600613 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.600557 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cpt5\" (UniqueName: \"kubernetes.io/projected/fe49d306-342c-46cb-a161-7299711e30fc-kube-api-access-5cpt5\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.701458 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.701419 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-oauth-serving-cert\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.701458 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.701459 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe49d306-342c-46cb-a161-7299711e30fc-console-serving-cert\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.701654 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.701587 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe49d306-342c-46cb-a161-7299711e30fc-console-oauth-config\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.701654 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.701638 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-console-config\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.701718 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.701666 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-trusted-ca-bundle\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.701718 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.701697 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cpt5\" (UniqueName: \"kubernetes.io/projected/fe49d306-342c-46cb-a161-7299711e30fc-kube-api-access-5cpt5\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.701814 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.701750 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-service-ca\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.702220 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.702187 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-oauth-serving-cert\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.702405 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.702387 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-console-config\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.702482 ip-10-0-141-192 
kubenswrapper[2567]: I0416 18:14:54.702453 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-service-ca\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.702634 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.702607 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-trusted-ca-bundle\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.703895 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.703868 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe49d306-342c-46cb-a161-7299711e30fc-console-serving-cert\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.704066 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.704034 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe49d306-342c-46cb-a161-7299711e30fc-console-oauth-config\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.709232 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.709212 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cpt5\" (UniqueName: \"kubernetes.io/projected/fe49d306-342c-46cb-a161-7299711e30fc-kube-api-access-5cpt5\") pod \"console-54c68c4599-p82pw\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") " pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.814337 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.814248 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:14:54.932789 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:54.932749 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54c68c4599-p82pw"] Apr 16 18:14:54.935354 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:14:54.935325 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe49d306_342c_46cb_a161_7299711e30fc.slice/crio-4f0ab478439e75dd0faf9332885363fd5989fc9ea207da6a067ef1f7bc9be116 WatchSource:0}: Error finding container 4f0ab478439e75dd0faf9332885363fd5989fc9ea207da6a067ef1f7bc9be116: Status 404 returned error can't find the container with id 4f0ab478439e75dd0faf9332885363fd5989fc9ea207da6a067ef1f7bc9be116 Apr 16 18:14:55.210858 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:55.210821 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54c68c4599-p82pw" event={"ID":"fe49d306-342c-46cb-a161-7299711e30fc","Type":"ContainerStarted","Data":"1aa591355d8dac80ec9ec8cd2e1d5b3597ed90b7a9b50c9e3697ed0787e4e5e7"} Apr 16 18:14:55.210858 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:55.210865 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54c68c4599-p82pw" event={"ID":"fe49d306-342c-46cb-a161-7299711e30fc","Type":"ContainerStarted","Data":"4f0ab478439e75dd0faf9332885363fd5989fc9ea207da6a067ef1f7bc9be116"} Apr 16 18:14:55.229485 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:14:55.229437 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54c68c4599-p82pw" podStartSLOduration=1.229424009 podStartE2EDuration="1.229424009s" podCreationTimestamp="2026-04-16 18:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:14:55.228024865 +0000 UTC m=+256.611688335" watchObservedRunningTime="2026-04-16 18:14:55.229424009 +0000 UTC m=+256.613087479" Apr 16 18:15:04.815133 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:04.815099 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:15:04.815133 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:04.815138 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:15:04.819961 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:04.819940 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:15:05.241902 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:05.241875 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54c68c4599-p82pw" Apr 16 18:15:05.291419 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:05.291384 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c5db7f49f-9gjsl"] Apr 16 18:15:30.315503 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.315442 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c5db7f49f-9gjsl" podUID="398217c0-98ec-4b26-9fdf-481198b4ff89" containerName="console" containerID="cri-o://e5ff408e199b55eb6c81c278f66c48e6b8049569d905bc288243fe998da86eb0" gracePeriod=15 Apr 16 18:15:30.542943 ip-10-0-141-192 kubenswrapper[2567]: I0416 
18:15:30.542914 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c5db7f49f-9gjsl_398217c0-98ec-4b26-9fdf-481198b4ff89/console/0.log" Apr 16 18:15:30.543064 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.542976 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5db7f49f-9gjsl" Apr 16 18:15:30.591536 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.591475 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-console-config\") pod \"398217c0-98ec-4b26-9fdf-481198b4ff89\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " Apr 16 18:15:30.591536 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.591518 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-oauth-serving-cert\") pod \"398217c0-98ec-4b26-9fdf-481198b4ff89\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " Apr 16 18:15:30.591672 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.591543 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/398217c0-98ec-4b26-9fdf-481198b4ff89-console-oauth-config\") pod \"398217c0-98ec-4b26-9fdf-481198b4ff89\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " Apr 16 18:15:30.591764 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.591729 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-service-ca\") pod \"398217c0-98ec-4b26-9fdf-481198b4ff89\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " Apr 16 18:15:30.591764 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.591763 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-trusted-ca-bundle\") pod \"398217c0-98ec-4b26-9fdf-481198b4ff89\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " Apr 16 18:15:30.591931 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.591824 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw2sk\" (UniqueName: \"kubernetes.io/projected/398217c0-98ec-4b26-9fdf-481198b4ff89-kube-api-access-dw2sk\") pod \"398217c0-98ec-4b26-9fdf-481198b4ff89\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " Apr 16 18:15:30.591931 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.591845 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "398217c0-98ec-4b26-9fdf-481198b4ff89" (UID: "398217c0-98ec-4b26-9fdf-481198b4ff89"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:15:30.591931 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.591868 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-console-config" (OuterVolumeSpecName: "console-config") pod "398217c0-98ec-4b26-9fdf-481198b4ff89" (UID: "398217c0-98ec-4b26-9fdf-481198b4ff89"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:15:30.591931 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.591885 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/398217c0-98ec-4b26-9fdf-481198b4ff89-console-serving-cert\") pod \"398217c0-98ec-4b26-9fdf-481198b4ff89\" (UID: \"398217c0-98ec-4b26-9fdf-481198b4ff89\") " Apr 16 18:15:30.592175 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.592150 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-console-config\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:15:30.592175 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.592171 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-oauth-serving-cert\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:15:30.592257 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.592121 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-service-ca" (OuterVolumeSpecName: "service-ca") pod "398217c0-98ec-4b26-9fdf-481198b4ff89" (UID: "398217c0-98ec-4b26-9fdf-481198b4ff89"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:15:30.592360 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.592329 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "398217c0-98ec-4b26-9fdf-481198b4ff89" (UID: "398217c0-98ec-4b26-9fdf-481198b4ff89"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:15:30.593683 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.593662 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398217c0-98ec-4b26-9fdf-481198b4ff89-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "398217c0-98ec-4b26-9fdf-481198b4ff89" (UID: "398217c0-98ec-4b26-9fdf-481198b4ff89"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:15:30.593970 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.593944 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398217c0-98ec-4b26-9fdf-481198b4ff89-kube-api-access-dw2sk" (OuterVolumeSpecName: "kube-api-access-dw2sk") pod "398217c0-98ec-4b26-9fdf-481198b4ff89" (UID: "398217c0-98ec-4b26-9fdf-481198b4ff89"). InnerVolumeSpecName "kube-api-access-dw2sk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:15:30.594119 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.593974 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398217c0-98ec-4b26-9fdf-481198b4ff89-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "398217c0-98ec-4b26-9fdf-481198b4ff89" (UID: "398217c0-98ec-4b26-9fdf-481198b4ff89"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:15:30.693551 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.693518 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-service-ca\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:15:30.693551 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.693547 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/398217c0-98ec-4b26-9fdf-481198b4ff89-trusted-ca-bundle\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:15:30.693551 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.693558 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dw2sk\" (UniqueName: \"kubernetes.io/projected/398217c0-98ec-4b26-9fdf-481198b4ff89-kube-api-access-dw2sk\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:15:30.693763 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.693568 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/398217c0-98ec-4b26-9fdf-481198b4ff89-console-serving-cert\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:15:30.693763 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:30.693577 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/398217c0-98ec-4b26-9fdf-481198b4ff89-console-oauth-config\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:15:31.309832 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:31.309807 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c5db7f49f-9gjsl_398217c0-98ec-4b26-9fdf-481198b4ff89/console/0.log" Apr 16 18:15:31.309973 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:31.309846 2567 generic.go:358] "Generic (PLEG): container finished" podID="398217c0-98ec-4b26-9fdf-481198b4ff89" containerID="e5ff408e199b55eb6c81c278f66c48e6b8049569d905bc288243fe998da86eb0" exitCode=2 Apr 16 18:15:31.309973 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:31.309892 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5db7f49f-9gjsl" event={"ID":"398217c0-98ec-4b26-9fdf-481198b4ff89","Type":"ContainerDied","Data":"e5ff408e199b55eb6c81c278f66c48e6b8049569d905bc288243fe998da86eb0"} Apr 16 18:15:31.309973 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:31.309911 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c5db7f49f-9gjsl" Apr 16 18:15:31.309973 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:31.309930 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5db7f49f-9gjsl" event={"ID":"398217c0-98ec-4b26-9fdf-481198b4ff89","Type":"ContainerDied","Data":"76e6ad1f301783b255c18dacf914deecf6a4ef3b12f2488aca58ea3c871aa03f"} Apr 16 18:15:31.309973 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:31.309947 2567 scope.go:117] "RemoveContainer" containerID="e5ff408e199b55eb6c81c278f66c48e6b8049569d905bc288243fe998da86eb0" Apr 16 18:15:31.317963 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:31.317777 2567 scope.go:117] "RemoveContainer" containerID="e5ff408e199b55eb6c81c278f66c48e6b8049569d905bc288243fe998da86eb0" Apr 16 18:15:31.318234 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:15:31.318101 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5ff408e199b55eb6c81c278f66c48e6b8049569d905bc288243fe998da86eb0\": container with ID starting with e5ff408e199b55eb6c81c278f66c48e6b8049569d905bc288243fe998da86eb0 not found: ID does not exist" containerID="e5ff408e199b55eb6c81c278f66c48e6b8049569d905bc288243fe998da86eb0" Apr 16 18:15:31.318234 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:31.318147 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5ff408e199b55eb6c81c278f66c48e6b8049569d905bc288243fe998da86eb0"} err="failed to get container status \"e5ff408e199b55eb6c81c278f66c48e6b8049569d905bc288243fe998da86eb0\": rpc error: code = NotFound desc = could not find container \"e5ff408e199b55eb6c81c278f66c48e6b8049569d905bc288243fe998da86eb0\": container with ID starting with e5ff408e199b55eb6c81c278f66c48e6b8049569d905bc288243fe998da86eb0 not found: ID does not exist" Apr 16 18:15:31.328026 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:31.328002 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c5db7f49f-9gjsl"] Apr 16 18:15:31.330885 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:31.330866 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c5db7f49f-9gjsl"] Apr 16 18:15:33.241193 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:33.241157 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398217c0-98ec-4b26-9fdf-481198b4ff89" path="/var/lib/kubelet/pods/398217c0-98ec-4b26-9fdf-481198b4ff89/volumes" Apr 16 18:15:39.131081 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:15:39.131035 2567 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:16:00.915026 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:00.914994 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq"] Apr 16 18:16:00.917442 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:00.915314 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="398217c0-98ec-4b26-9fdf-481198b4ff89" containerName="console" Apr 16 18:16:00.917442 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:00.915324 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="398217c0-98ec-4b26-9fdf-481198b4ff89" containerName="console" Apr 16 18:16:00.917442 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:00.915374 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="398217c0-98ec-4b26-9fdf-481198b4ff89" containerName="console" 
Apr 16 18:16:00.918362 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:00.918344 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" Apr 16 18:16:00.921706 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:00.921685 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:16:00.922251 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:00.922233 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:16:00.923948 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:00.923926 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-n6d6h\"" Apr 16 18:16:00.934054 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:00.934022 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq"] Apr 16 18:16:01.033764 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:01.033725 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq\" (UID: \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" Apr 16 18:16:01.033927 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:01.033781 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96b26\" (UniqueName: \"kubernetes.io/projected/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-kube-api-access-96b26\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq\" (UID: \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" Apr 16 18:16:01.033927 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:01.033881 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq\" (UID: \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" Apr 16 18:16:01.134925 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:01.134884 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96b26\" (UniqueName: \"kubernetes.io/projected/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-kube-api-access-96b26\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq\" (UID: \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" Apr 16 18:16:01.135073 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:01.134948 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq\" (UID: \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" Apr 16 18:16:01.135073 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:01.134980 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq\" (UID: \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" Apr 16 18:16:01.135324 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:01.135309 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq\" (UID: \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" Apr 16 18:16:01.135364 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:01.135346 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq\" (UID: \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" Apr 16 18:16:01.143035 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:01.143006 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96b26\" (UniqueName: \"kubernetes.io/projected/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-kube-api-access-96b26\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq\" (UID: \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" Apr 16 18:16:01.228068 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:01.228000 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" Apr 16 18:16:01.347117 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:01.347088 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq"] Apr 16 18:16:01.349458 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:16:01.349431 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7ea5910_fa50_4f4b_9b18_a25e880cedc2.slice/crio-210570ff1e563e6952ab9faf9069a33547d2b52949b86df72b1536b51fc9bf11 WatchSource:0}: Error finding container 210570ff1e563e6952ab9faf9069a33547d2b52949b86df72b1536b51fc9bf11: Status 404 returned error can't find the container with id 210570ff1e563e6952ab9faf9069a33547d2b52949b86df72b1536b51fc9bf11 Apr 16 18:16:01.351334 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:01.351317 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:16:01.393525 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:01.393485 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" event={"ID":"d7ea5910-fa50-4f4b-9b18-a25e880cedc2","Type":"ContainerStarted","Data":"210570ff1e563e6952ab9faf9069a33547d2b52949b86df72b1536b51fc9bf11"} Apr 16 18:16:06.410161 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:06.410071 2567 generic.go:358] "Generic (PLEG): container finished" podID="d7ea5910-fa50-4f4b-9b18-a25e880cedc2" containerID="6e52d86cf7278746286d3339a6da02dedb5128b0656cd56cff274c80390e5f64" exitCode=0 Apr 16 18:16:06.410532 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:06.410184 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" event={"ID":"d7ea5910-fa50-4f4b-9b18-a25e880cedc2","Type":"ContainerDied","Data":"6e52d86cf7278746286d3339a6da02dedb5128b0656cd56cff274c80390e5f64"} Apr 16 18:16:09.421285 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:09.421246 2567 generic.go:358] "Generic (PLEG): container finished" podID="d7ea5910-fa50-4f4b-9b18-a25e880cedc2" containerID="9910e64628aa72342391a55adf76ed3357c408a888f0dd9e7a51ebcf1eb8fd57" exitCode=0 Apr 16 18:16:09.421639 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:09.421329 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" event={"ID":"d7ea5910-fa50-4f4b-9b18-a25e880cedc2","Type":"ContainerDied","Data":"9910e64628aa72342391a55adf76ed3357c408a888f0dd9e7a51ebcf1eb8fd57"} Apr 16 18:16:15.439763 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:15.439727 2567 generic.go:358] "Generic (PLEG): container finished" podID="d7ea5910-fa50-4f4b-9b18-a25e880cedc2" containerID="1773403cc6dd6e845b77f8ecd2097507dbf45280e17417459b84c910362ae7c8" exitCode=0 Apr 16 18:16:15.440128 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:15.439813 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" event={"ID":"d7ea5910-fa50-4f4b-9b18-a25e880cedc2","Type":"ContainerDied","Data":"1773403cc6dd6e845b77f8ecd2097507dbf45280e17417459b84c910362ae7c8"} Apr 16 18:16:16.567692 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:16.567669 2567 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" Apr 16 18:16:16.666849 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:16.666818 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-bundle\") pod \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\" (UID: \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\") " Apr 16 18:16:16.667009 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:16.666889 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-util\") pod \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\" (UID: \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\") " Apr 16 18:16:16.667009 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:16.666916 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96b26\" (UniqueName: \"kubernetes.io/projected/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-kube-api-access-96b26\") pod \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\" (UID: \"d7ea5910-fa50-4f4b-9b18-a25e880cedc2\") " Apr 16 18:16:16.667491 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:16.667460 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-bundle" (OuterVolumeSpecName: "bundle") pod "d7ea5910-fa50-4f4b-9b18-a25e880cedc2" (UID: "d7ea5910-fa50-4f4b-9b18-a25e880cedc2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:16:16.669230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:16.669206 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-kube-api-access-96b26" (OuterVolumeSpecName: "kube-api-access-96b26") pod "d7ea5910-fa50-4f4b-9b18-a25e880cedc2" (UID: "d7ea5910-fa50-4f4b-9b18-a25e880cedc2"). InnerVolumeSpecName "kube-api-access-96b26". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:16:16.671447 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:16.671429 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-util" (OuterVolumeSpecName: "util") pod "d7ea5910-fa50-4f4b-9b18-a25e880cedc2" (UID: "d7ea5910-fa50-4f4b-9b18-a25e880cedc2"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:16:16.767812 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:16.767729 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-bundle\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:16:16.767812 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:16.767761 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-util\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:16:16.767812 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:16.767772 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-96b26\" (UniqueName: \"kubernetes.io/projected/d7ea5910-fa50-4f4b-9b18-a25e880cedc2-kube-api-access-96b26\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:16:17.447297 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:17.447262 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" event={"ID":"d7ea5910-fa50-4f4b-9b18-a25e880cedc2","Type":"ContainerDied","Data":"210570ff1e563e6952ab9faf9069a33547d2b52949b86df72b1536b51fc9bf11"} Apr 16 18:16:17.447297 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:17.447297 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="210570ff1e563e6952ab9faf9069a33547d2b52949b86df72b1536b51fc9bf11" Apr 16 18:16:17.447478 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:17.447311 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2thzq" Apr 16 18:16:22.527970 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.527936 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z"] Apr 16 18:16:22.528355 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.528225 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7ea5910-fa50-4f4b-9b18-a25e880cedc2" containerName="extract" Apr 16 18:16:22.528355 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.528237 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ea5910-fa50-4f4b-9b18-a25e880cedc2" containerName="extract" Apr 16 18:16:22.528355 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.528249 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7ea5910-fa50-4f4b-9b18-a25e880cedc2" containerName="pull" Apr 16 18:16:22.528355 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.528254 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ea5910-fa50-4f4b-9b18-a25e880cedc2" containerName="pull" Apr 16 18:16:22.528355 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.528270 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7ea5910-fa50-4f4b-9b18-a25e880cedc2" containerName="util" Apr 16 18:16:22.528355 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.528275 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ea5910-fa50-4f4b-9b18-a25e880cedc2" containerName="util" Apr 16 18:16:22.528355 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.528319 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7ea5910-fa50-4f4b-9b18-a25e880cedc2" containerName="extract" Apr 
Apr 16 18:16:22.550016 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.549920 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z"
Apr 16 18:16:22.552648 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.552620 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 16 18:16:22.552756 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.552657 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-gr622\""
Apr 16 18:16:22.552756 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.552661 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 16 18:16:22.552844 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.552823 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 16 18:16:22.714938 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.714897 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/1a283e37-9ce5-4451-be8e-0a3ddc6c29f6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z\" (UID: \"1a283e37-9ce5-4451-be8e-0a3ddc6c29f6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z"
Apr 16 18:16:22.715112 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.715036 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45xh5\" (UniqueName: \"kubernetes.io/projected/1a283e37-9ce5-4451-be8e-0a3ddc6c29f6-kube-api-access-45xh5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z\" (UID: \"1a283e37-9ce5-4451-be8e-0a3ddc6c29f6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z"
Apr 16 18:16:22.816364 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.816286 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45xh5\" (UniqueName: \"kubernetes.io/projected/1a283e37-9ce5-4451-be8e-0a3ddc6c29f6-kube-api-access-45xh5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z\" (UID: \"1a283e37-9ce5-4451-be8e-0a3ddc6c29f6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z"
Apr 16 18:16:22.816364 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.816332 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/1a283e37-9ce5-4451-be8e-0a3ddc6c29f6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z\" (UID: \"1a283e37-9ce5-4451-be8e-0a3ddc6c29f6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z"
Apr 16 18:16:22.818549 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.818522 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/1a283e37-9ce5-4451-be8e-0a3ddc6c29f6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z\" (UID: \"1a283e37-9ce5-4451-be8e-0a3ddc6c29f6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z"
Apr 16 18:16:22.824476 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.824454 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45xh5\" (UniqueName: \"kubernetes.io/projected/1a283e37-9ce5-4451-be8e-0a3ddc6c29f6-kube-api-access-45xh5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z\" (UID: \"1a283e37-9ce5-4451-be8e-0a3ddc6c29f6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z"
Apr 16 18:16:22.859844 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.859816 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z"
Apr 16 18:16:22.982633 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:22.982534 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z"]
Apr 16 18:16:22.984991 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:16:22.984959 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a283e37_9ce5_4451_be8e_0a3ddc6c29f6.slice/crio-638775dab7ad7919b2621fec8a47f76be54c6baf121fef41675fc7e7ab2ca99b WatchSource:0}: Error finding container 638775dab7ad7919b2621fec8a47f76be54c6baf121fef41675fc7e7ab2ca99b: Status 404 returned error can't find the container with id 638775dab7ad7919b2621fec8a47f76be54c6baf121fef41675fc7e7ab2ca99b
Apr 16 18:16:23.464995 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:23.464953 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z" event={"ID":"1a283e37-9ce5-4451-be8e-0a3ddc6c29f6","Type":"ContainerStarted","Data":"638775dab7ad7919b2621fec8a47f76be54c6baf121fef41675fc7e7ab2ca99b"}
Apr 16 18:16:27.431819 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.431787 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rh8zk"]
Apr 16 18:16:27.454960 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.454931 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rh8zk"]
Apr 16 18:16:27.455129 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.455084 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-rh8zk"
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-rh8zk" Apr 16 18:16:27.457818 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.457794 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 18:16:27.457946 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.457794 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 18:16:27.457946 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.457830 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-4998z\"" Apr 16 18:16:27.482210 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.482177 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z" event={"ID":"1a283e37-9ce5-4451-be8e-0a3ddc6c29f6","Type":"ContainerStarted","Data":"278747079f9d621eadf816e74046a08c0a95a14a52e4c079f3bc0dd743d43415"} Apr 16 18:16:27.482321 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.482296 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z" Apr 16 18:16:27.499554 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.499511 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z" podStartSLOduration=1.574345608 podStartE2EDuration="5.499499998s" podCreationTimestamp="2026-04-16 18:16:22 +0000 UTC" firstStartedPulling="2026-04-16 18:16:22.986618361 +0000 UTC m=+344.370281814" lastFinishedPulling="2026-04-16 18:16:26.911772755 +0000 UTC m=+348.295436204" observedRunningTime="2026-04-16 18:16:27.497535602 +0000 UTC m=+348.881199073" watchObservedRunningTime="2026-04-16 18:16:27.499499998 +0000 UTC m=+348.883163468" Apr 16 18:16:27.561020 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.560984 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/df558e9e-ab65-4160-8208-79e6f7757fe5-cabundle0\") pod \"keda-operator-ffbb595cb-rh8zk\" (UID: \"df558e9e-ab65-4160-8208-79e6f7757fe5\") " pod="openshift-keda/keda-operator-ffbb595cb-rh8zk" Apr 16 18:16:27.561192 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.561083 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9fwr\" (UniqueName: \"kubernetes.io/projected/df558e9e-ab65-4160-8208-79e6f7757fe5-kube-api-access-t9fwr\") pod \"keda-operator-ffbb595cb-rh8zk\" (UID: \"df558e9e-ab65-4160-8208-79e6f7757fe5\") " pod="openshift-keda/keda-operator-ffbb595cb-rh8zk" Apr 16 18:16:27.561192 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.561127 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df558e9e-ab65-4160-8208-79e6f7757fe5-certificates\") pod \"keda-operator-ffbb595cb-rh8zk\" (UID: \"df558e9e-ab65-4160-8208-79e6f7757fe5\") " pod="openshift-keda/keda-operator-ffbb595cb-rh8zk" Apr 16 18:16:27.662075 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.662020 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/df558e9e-ab65-4160-8208-79e6f7757fe5-cabundle0\") pod 
\"keda-operator-ffbb595cb-rh8zk\" (UID: \"df558e9e-ab65-4160-8208-79e6f7757fe5\") " pod="openshift-keda/keda-operator-ffbb595cb-rh8zk" Apr 16 18:16:27.662259 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.662104 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9fwr\" (UniqueName: \"kubernetes.io/projected/df558e9e-ab65-4160-8208-79e6f7757fe5-kube-api-access-t9fwr\") pod \"keda-operator-ffbb595cb-rh8zk\" (UID: \"df558e9e-ab65-4160-8208-79e6f7757fe5\") " pod="openshift-keda/keda-operator-ffbb595cb-rh8zk" Apr 16 18:16:27.662259 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.662144 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df558e9e-ab65-4160-8208-79e6f7757fe5-certificates\") pod \"keda-operator-ffbb595cb-rh8zk\" (UID: \"df558e9e-ab65-4160-8208-79e6f7757fe5\") " pod="openshift-keda/keda-operator-ffbb595cb-rh8zk" Apr 16 18:16:27.662429 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:16:27.662262 2567 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 16 18:16:27.662429 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:16:27.662279 2567 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:16:27.662429 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:16:27.662288 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:16:27.662429 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:16:27.662304 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rh8zk: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 18:16:27.662429 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:16:27.662378 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df558e9e-ab65-4160-8208-79e6f7757fe5-certificates podName:df558e9e-ab65-4160-8208-79e6f7757fe5 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:28.16235799 +0000 UTC m=+349.546021458 (durationBeforeRetry 500ms). 
Apr 16 18:16:27.662817 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.662795 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/df558e9e-ab65-4160-8208-79e6f7757fe5-cabundle0\") pod \"keda-operator-ffbb595cb-rh8zk\" (UID: \"df558e9e-ab65-4160-8208-79e6f7757fe5\") " pod="openshift-keda/keda-operator-ffbb595cb-rh8zk"
Apr 16 18:16:27.680158 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:27.680133 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9fwr\" (UniqueName: \"kubernetes.io/projected/df558e9e-ab65-4160-8208-79e6f7757fe5-kube-api-access-t9fwr\") pod \"keda-operator-ffbb595cb-rh8zk\" (UID: \"df558e9e-ab65-4160-8208-79e6f7757fe5\") " pod="openshift-keda/keda-operator-ffbb595cb-rh8zk"
Apr 16 18:16:28.166501 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:28.166468 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df558e9e-ab65-4160-8208-79e6f7757fe5-certificates\") pod \"keda-operator-ffbb595cb-rh8zk\" (UID: \"df558e9e-ab65-4160-8208-79e6f7757fe5\") " pod="openshift-keda/keda-operator-ffbb595cb-rh8zk"
Apr 16 18:16:28.166672 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:16:28.166587 2567 secret.go:281] references non-existent secret key: ca.crt
Apr 16 18:16:28.166672 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:16:28.166599 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 18:16:28.166672 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:16:28.166607 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rh8zk: references non-existent secret key: ca.crt
Apr 16 18:16:28.166672 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:16:28.166653 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df558e9e-ab65-4160-8208-79e6f7757fe5-certificates podName:df558e9e-ab65-4160-8208-79e6f7757fe5 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:29.16664075 +0000 UTC m=+350.550304200 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/df558e9e-ab65-4160-8208-79e6f7757fe5-certificates") pod "keda-operator-ffbb595cb-rh8zk" (UID: "df558e9e-ab65-4160-8208-79e6f7757fe5") : references non-existent secret key: ca.crt
Apr 16 18:16:29.174800 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:29.174764 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df558e9e-ab65-4160-8208-79e6f7757fe5-certificates\") pod \"keda-operator-ffbb595cb-rh8zk\" (UID: \"df558e9e-ab65-4160-8208-79e6f7757fe5\") " pod="openshift-keda/keda-operator-ffbb595cb-rh8zk"
Apr 16 18:16:29.175216 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:16:29.174902 2567 secret.go:281] references non-existent secret key: ca.crt
Apr 16 18:16:29.175216 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:16:29.174923 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 18:16:29.175216 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:16:29.174933 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rh8zk: references non-existent secret key: ca.crt
Apr 16 18:16:29.175216 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:16:29.174994 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df558e9e-ab65-4160-8208-79e6f7757fe5-certificates podName:df558e9e-ab65-4160-8208-79e6f7757fe5 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:31.174980594 +0000 UTC m=+352.558644044 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/df558e9e-ab65-4160-8208-79e6f7757fe5-certificates") pod "keda-operator-ffbb595cb-rh8zk" (UID: "df558e9e-ab65-4160-8208-79e6f7757fe5") : references non-existent secret key: ca.crt
Apr 16 18:16:31.188190 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:31.188149 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df558e9e-ab65-4160-8208-79e6f7757fe5-certificates\") pod \"keda-operator-ffbb595cb-rh8zk\" (UID: \"df558e9e-ab65-4160-8208-79e6f7757fe5\") " pod="openshift-keda/keda-operator-ffbb595cb-rh8zk"
Apr 16 18:16:31.190524 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:31.190494 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df558e9e-ab65-4160-8208-79e6f7757fe5-certificates\") pod \"keda-operator-ffbb595cb-rh8zk\" (UID: \"df558e9e-ab65-4160-8208-79e6f7757fe5\") " pod="openshift-keda/keda-operator-ffbb595cb-rh8zk"
Apr 16 18:16:31.365576 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:31.365540 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-rh8zk"
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-rh8zk" Apr 16 18:16:31.488017 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:31.487995 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rh8zk"] Apr 16 18:16:31.490459 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:16:31.490434 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf558e9e_ab65_4160_8208_79e6f7757fe5.slice/crio-b6de3a444da40328c576d3f137faa42c01403a02e4e8d22edff06bddb30f2b89 WatchSource:0}: Error finding container b6de3a444da40328c576d3f137faa42c01403a02e4e8d22edff06bddb30f2b89: Status 404 returned error can't find the container with id b6de3a444da40328c576d3f137faa42c01403a02e4e8d22edff06bddb30f2b89 Apr 16 18:16:31.494815 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:31.494789 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-rh8zk" event={"ID":"df558e9e-ab65-4160-8208-79e6f7757fe5","Type":"ContainerStarted","Data":"b6de3a444da40328c576d3f137faa42c01403a02e4e8d22edff06bddb30f2b89"} Apr 16 18:16:35.508592 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:35.508552 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-rh8zk" event={"ID":"df558e9e-ab65-4160-8208-79e6f7757fe5","Type":"ContainerStarted","Data":"6bf9c04dd621416b1f9650072c07319db0aa579018934c625a233f557abcb869"} Apr 16 18:16:35.509147 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:35.508683 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-rh8zk" Apr 16 18:16:35.525717 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:35.525670 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-rh8zk" podStartSLOduration=5.400481937 podStartE2EDuration="8.525657236s" podCreationTimestamp="2026-04-16 18:16:27 +0000 UTC" firstStartedPulling="2026-04-16 18:16:31.491741192 +0000 UTC m=+352.875404641" lastFinishedPulling="2026-04-16 18:16:34.616916488 +0000 UTC m=+356.000579940" observedRunningTime="2026-04-16 18:16:35.523461667 +0000 UTC m=+356.907125137" watchObservedRunningTime="2026-04-16 18:16:35.525657236 +0000 UTC m=+356.909320706" Apr 16 18:16:48.487561 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:48.487528 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p2c7z" Apr 16 18:16:56.514525 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:16:56.514497 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-rh8zk" Apr 16 18:17:33.696785 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.696748 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-65589c6846-wkjgt"] Apr 16 18:17:33.700016 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.699992 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-wkjgt" Apr 16 18:17:33.702546 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.702527 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:17:33.703771 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.703649 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 18:17:33.703771 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.703716 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-l4r9h\"" Apr 16 18:17:33.703927 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.703654 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:17:33.711370 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.711348 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-wkjgt"] Apr 16 18:17:33.734655 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.734626 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-sf72x"] Apr 16 18:17:33.737514 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.737496 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-sf72x" Apr 16 18:17:33.740112 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.740089 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 18:17:33.740231 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.740129 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-qsxc9\"" Apr 16 18:17:33.745702 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.745682 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-sf72x"] Apr 16 18:17:33.791511 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.791477 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9c5t\" (UniqueName: \"kubernetes.io/projected/81ba27da-7a32-4fe2-a37a-24b85f770b8a-kube-api-access-j9c5t\") pod \"kserve-controller-manager-65589c6846-wkjgt\" (UID: \"81ba27da-7a32-4fe2-a37a-24b85f770b8a\") " pod="kserve/kserve-controller-manager-65589c6846-wkjgt" Apr 16 18:17:33.791693 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.791524 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg5kt\" (UniqueName: \"kubernetes.io/projected/6ddb2428-0e30-46e2-a7cb-52b3ad1ee367-kube-api-access-cg5kt\") pod \"seaweedfs-86cc847c5c-sf72x\" (UID: \"6ddb2428-0e30-46e2-a7cb-52b3ad1ee367\") " pod="kserve/seaweedfs-86cc847c5c-sf72x" Apr 16 18:17:33.791693 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.791569 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81ba27da-7a32-4fe2-a37a-24b85f770b8a-cert\") pod \"kserve-controller-manager-65589c6846-wkjgt\" (UID: \"81ba27da-7a32-4fe2-a37a-24b85f770b8a\") " pod="kserve/kserve-controller-manager-65589c6846-wkjgt" Apr 16 18:17:33.791693 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.791607 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6ddb2428-0e30-46e2-a7cb-52b3ad1ee367-data\") pod \"seaweedfs-86cc847c5c-sf72x\" (UID: \"6ddb2428-0e30-46e2-a7cb-52b3ad1ee367\") " pod="kserve/seaweedfs-86cc847c5c-sf72x" Apr 16 18:17:33.892074 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.892023 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9c5t\" (UniqueName: \"kubernetes.io/projected/81ba27da-7a32-4fe2-a37a-24b85f770b8a-kube-api-access-j9c5t\") pod \"kserve-controller-manager-65589c6846-wkjgt\" (UID: \"81ba27da-7a32-4fe2-a37a-24b85f770b8a\") " pod="kserve/kserve-controller-manager-65589c6846-wkjgt" Apr 16 18:17:33.892225 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.892094 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cg5kt\" (UniqueName: \"kubernetes.io/projected/6ddb2428-0e30-46e2-a7cb-52b3ad1ee367-kube-api-access-cg5kt\") pod \"seaweedfs-86cc847c5c-sf72x\" (UID: \"6ddb2428-0e30-46e2-a7cb-52b3ad1ee367\") " pod="kserve/seaweedfs-86cc847c5c-sf72x" Apr 16 18:17:33.892225 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.892156 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81ba27da-7a32-4fe2-a37a-24b85f770b8a-cert\") pod \"kserve-controller-manager-65589c6846-wkjgt\" (UID: \"81ba27da-7a32-4fe2-a37a-24b85f770b8a\") " pod="kserve/kserve-controller-manager-65589c6846-wkjgt" Apr 16 18:17:33.892225 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.892183 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6ddb2428-0e30-46e2-a7cb-52b3ad1ee367-data\") pod \"seaweedfs-86cc847c5c-sf72x\" (UID: \"6ddb2428-0e30-46e2-a7cb-52b3ad1ee367\") " pod="kserve/seaweedfs-86cc847c5c-sf72x" Apr 16 18:17:33.892607 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.892584 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6ddb2428-0e30-46e2-a7cb-52b3ad1ee367-data\") pod \"seaweedfs-86cc847c5c-sf72x\" (UID: \"6ddb2428-0e30-46e2-a7cb-52b3ad1ee367\") " pod="kserve/seaweedfs-86cc847c5c-sf72x" Apr 16 18:17:33.894452 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.894424 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81ba27da-7a32-4fe2-a37a-24b85f770b8a-cert\") pod \"kserve-controller-manager-65589c6846-wkjgt\" (UID: \"81ba27da-7a32-4fe2-a37a-24b85f770b8a\") " pod="kserve/kserve-controller-manager-65589c6846-wkjgt" Apr 16 18:17:33.901580 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.901558 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg5kt\" (UniqueName: \"kubernetes.io/projected/6ddb2428-0e30-46e2-a7cb-52b3ad1ee367-kube-api-access-cg5kt\") pod \"seaweedfs-86cc847c5c-sf72x\" (UID: \"6ddb2428-0e30-46e2-a7cb-52b3ad1ee367\") " pod="kserve/seaweedfs-86cc847c5c-sf72x" Apr 16 18:17:33.902017 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:33.901996 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9c5t\" (UniqueName: \"kubernetes.io/projected/81ba27da-7a32-4fe2-a37a-24b85f770b8a-kube-api-access-j9c5t\") pod \"kserve-controller-manager-65589c6846-wkjgt\" (UID: \"81ba27da-7a32-4fe2-a37a-24b85f770b8a\") " pod="kserve/kserve-controller-manager-65589c6846-wkjgt" Apr 16 18:17:34.013409 ip-10-0-141-192 
kubenswrapper[2567]: I0416 18:17:34.013320 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-wkjgt" Apr 16 18:17:34.047457 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:34.047421 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-sf72x" Apr 16 18:17:34.146826 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:34.146794 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-wkjgt"] Apr 16 18:17:34.149020 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:17:34.148991 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81ba27da_7a32_4fe2_a37a_24b85f770b8a.slice/crio-caf99cfe03507eaea7dfde382bcd88da1be3e128f550de42664043aa4e5057a8 WatchSource:0}: Error finding container caf99cfe03507eaea7dfde382bcd88da1be3e128f550de42664043aa4e5057a8: Status 404 returned error can't find the container with id caf99cfe03507eaea7dfde382bcd88da1be3e128f550de42664043aa4e5057a8 Apr 16 18:17:34.187704 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:34.187676 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-sf72x"] Apr 16 18:17:34.190750 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:17:34.190726 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ddb2428_0e30_46e2_a7cb_52b3ad1ee367.slice/crio-becf9f54d1ce53ba9c337bb14f42aec664ef995b216f5add56b6cf1d0b22d95f WatchSource:0}: Error finding container becf9f54d1ce53ba9c337bb14f42aec664ef995b216f5add56b6cf1d0b22d95f: Status 404 returned error can't find the container with id becf9f54d1ce53ba9c337bb14f42aec664ef995b216f5add56b6cf1d0b22d95f Apr 16 18:17:34.690238 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:34.690199 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-wkjgt" event={"ID":"81ba27da-7a32-4fe2-a37a-24b85f770b8a","Type":"ContainerStarted","Data":"caf99cfe03507eaea7dfde382bcd88da1be3e128f550de42664043aa4e5057a8"} Apr 16 18:17:34.691479 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:34.691446 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-sf72x" event={"ID":"6ddb2428-0e30-46e2-a7cb-52b3ad1ee367","Type":"ContainerStarted","Data":"becf9f54d1ce53ba9c337bb14f42aec664ef995b216f5add56b6cf1d0b22d95f"} Apr 16 18:17:38.706677 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:38.706641 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-wkjgt" event={"ID":"81ba27da-7a32-4fe2-a37a-24b85f770b8a","Type":"ContainerStarted","Data":"6c549a2c62d45b104275ca12602de149ca098615f4eea99a5be74c7c913f54bb"} Apr 16 18:17:38.707158 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:38.706748 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-65589c6846-wkjgt" Apr 16 18:17:38.707971 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:38.707951 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-sf72x" event={"ID":"6ddb2428-0e30-46e2-a7cb-52b3ad1ee367","Type":"ContainerStarted","Data":"27cbb2b4b0b878992eb59ffa24b4e5ebab614103e3e735b2362ad2ec36b8008f"} Apr 16 18:17:38.708089 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:38.708078 2567 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-sf72x" Apr 16 18:17:38.723243 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:38.723201 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-65589c6846-wkjgt" podStartSLOduration=2.167661051 podStartE2EDuration="5.72318698s" podCreationTimestamp="2026-04-16 18:17:33 +0000 UTC" firstStartedPulling="2026-04-16 18:17:34.150370741 +0000 UTC m=+415.534034194" lastFinishedPulling="2026-04-16 18:17:37.705896669 +0000 UTC m=+419.089560123" observedRunningTime="2026-04-16 18:17:38.721439673 +0000 UTC m=+420.105103144" watchObservedRunningTime="2026-04-16 18:17:38.72318698 +0000 UTC m=+420.106850451" Apr 16 18:17:38.737190 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:38.737151 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-sf72x" podStartSLOduration=2.113155679 podStartE2EDuration="5.737138804s" podCreationTimestamp="2026-04-16 18:17:33 +0000 UTC" firstStartedPulling="2026-04-16 18:17:34.191932594 +0000 UTC m=+415.575596043" lastFinishedPulling="2026-04-16 18:17:37.815915718 +0000 UTC m=+419.199579168" observedRunningTime="2026-04-16 18:17:38.735679787 +0000 UTC m=+420.119343261" watchObservedRunningTime="2026-04-16 18:17:38.737138804 +0000 UTC m=+420.120802275" Apr 16 18:17:44.712421 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:17:44.712389 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-sf72x" Apr 16 18:18:09.716205 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:09.716170 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-65589c6846-wkjgt" Apr 16 18:18:09.873308 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:09.873274 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-wkjgt"] Apr 16 18:18:09.873501 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:09.873480 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-65589c6846-wkjgt" podUID="81ba27da-7a32-4fe2-a37a-24b85f770b8a" containerName="manager" containerID="cri-o://6c549a2c62d45b104275ca12602de149ca098615f4eea99a5be74c7c913f54bb" gracePeriod=10 Apr 16 18:18:09.896088 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:09.896060 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-65589c6846-m7fxb"] Apr 16 18:18:09.898559 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:09.898545 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-m7fxb" Apr 16 18:18:09.906312 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:09.906288 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-m7fxb"] Apr 16 18:18:09.991851 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:09.991821 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad965b4d-28c7-48d3-a278-b1f64b8c8284-cert\") pod \"kserve-controller-manager-65589c6846-m7fxb\" (UID: \"ad965b4d-28c7-48d3-a278-b1f64b8c8284\") " pod="kserve/kserve-controller-manager-65589c6846-m7fxb" Apr 16 18:18:09.991994 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:09.991905 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cprn4\" (UniqueName: \"kubernetes.io/projected/ad965b4d-28c7-48d3-a278-b1f64b8c8284-kube-api-access-cprn4\") pod \"kserve-controller-manager-65589c6846-m7fxb\" (UID: \"ad965b4d-28c7-48d3-a278-b1f64b8c8284\") " pod="kserve/kserve-controller-manager-65589c6846-m7fxb" Apr 16 18:18:10.092972 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.092925 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cprn4\" (UniqueName: \"kubernetes.io/projected/ad965b4d-28c7-48d3-a278-b1f64b8c8284-kube-api-access-cprn4\") pod \"kserve-controller-manager-65589c6846-m7fxb\" (UID: \"ad965b4d-28c7-48d3-a278-b1f64b8c8284\") " pod="kserve/kserve-controller-manager-65589c6846-m7fxb" Apr 16 18:18:10.093172 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.093022 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad965b4d-28c7-48d3-a278-b1f64b8c8284-cert\") pod \"kserve-controller-manager-65589c6846-m7fxb\" (UID: \"ad965b4d-28c7-48d3-a278-b1f64b8c8284\") " pod="kserve/kserve-controller-manager-65589c6846-m7fxb" Apr 16 18:18:10.095404 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.095382 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad965b4d-28c7-48d3-a278-b1f64b8c8284-cert\") pod \"kserve-controller-manager-65589c6846-m7fxb\" (UID: \"ad965b4d-28c7-48d3-a278-b1f64b8c8284\") " pod="kserve/kserve-controller-manager-65589c6846-m7fxb" Apr 16 18:18:10.101609 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.101583 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cprn4\" (UniqueName: \"kubernetes.io/projected/ad965b4d-28c7-48d3-a278-b1f64b8c8284-kube-api-access-cprn4\") pod \"kserve-controller-manager-65589c6846-m7fxb\" (UID: \"ad965b4d-28c7-48d3-a278-b1f64b8c8284\") " pod="kserve/kserve-controller-manager-65589c6846-m7fxb" Apr 16 18:18:10.112715 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.112692 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-wkjgt" Apr 16 18:18:10.193336 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.193306 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81ba27da-7a32-4fe2-a37a-24b85f770b8a-cert\") pod \"81ba27da-7a32-4fe2-a37a-24b85f770b8a\" (UID: \"81ba27da-7a32-4fe2-a37a-24b85f770b8a\") " Apr 16 18:18:10.193536 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.193353 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9c5t\" (UniqueName: \"kubernetes.io/projected/81ba27da-7a32-4fe2-a37a-24b85f770b8a-kube-api-access-j9c5t\") pod \"81ba27da-7a32-4fe2-a37a-24b85f770b8a\" (UID: \"81ba27da-7a32-4fe2-a37a-24b85f770b8a\") " Apr 16 18:18:10.195371 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.195341 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ba27da-7a32-4fe2-a37a-24b85f770b8a-cert" (OuterVolumeSpecName: "cert") pod "81ba27da-7a32-4fe2-a37a-24b85f770b8a" (UID: "81ba27da-7a32-4fe2-a37a-24b85f770b8a"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:18:10.195514 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.195474 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ba27da-7a32-4fe2-a37a-24b85f770b8a-kube-api-access-j9c5t" (OuterVolumeSpecName: "kube-api-access-j9c5t") pod "81ba27da-7a32-4fe2-a37a-24b85f770b8a" (UID: "81ba27da-7a32-4fe2-a37a-24b85f770b8a"). InnerVolumeSpecName "kube-api-access-j9c5t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:18:10.255471 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.255387 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-m7fxb" Apr 16 18:18:10.294021 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.293989 2567 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81ba27da-7a32-4fe2-a37a-24b85f770b8a-cert\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:18:10.294021 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.294019 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j9c5t\" (UniqueName: \"kubernetes.io/projected/81ba27da-7a32-4fe2-a37a-24b85f770b8a-kube-api-access-j9c5t\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:18:10.370576 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.370554 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-m7fxb"] Apr 16 18:18:10.372455 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:18:10.372431 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad965b4d_28c7_48d3_a278_b1f64b8c8284.slice/crio-9f96ff7abf220e1ef06eb7b8e7bb9d81bb7dbd053b19d2fe72bd154d7ee6a6ac WatchSource:0}: Error finding container 9f96ff7abf220e1ef06eb7b8e7bb9d81bb7dbd053b19d2fe72bd154d7ee6a6ac: Status 404 returned error can't find the container with id 9f96ff7abf220e1ef06eb7b8e7bb9d81bb7dbd053b19d2fe72bd154d7ee6a6ac Apr 16 18:18:10.810383 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.810357 2567 generic.go:358] "Generic (PLEG): container finished" podID="81ba27da-7a32-4fe2-a37a-24b85f770b8a" containerID="6c549a2c62d45b104275ca12602de149ca098615f4eea99a5be74c7c913f54bb" exitCode=0 Apr 16 18:18:10.810720 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.810412 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-wkjgt" Apr 16 18:18:10.810720 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.810436 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-wkjgt" event={"ID":"81ba27da-7a32-4fe2-a37a-24b85f770b8a","Type":"ContainerDied","Data":"6c549a2c62d45b104275ca12602de149ca098615f4eea99a5be74c7c913f54bb"} Apr 16 18:18:10.810720 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.810467 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-wkjgt" event={"ID":"81ba27da-7a32-4fe2-a37a-24b85f770b8a","Type":"ContainerDied","Data":"caf99cfe03507eaea7dfde382bcd88da1be3e128f550de42664043aa4e5057a8"} Apr 16 18:18:10.810720 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.810487 2567 scope.go:117] "RemoveContainer" containerID="6c549a2c62d45b104275ca12602de149ca098615f4eea99a5be74c7c913f54bb" Apr 16 18:18:10.811547 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.811520 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-m7fxb" event={"ID":"ad965b4d-28c7-48d3-a278-b1f64b8c8284","Type":"ContainerStarted","Data":"9f96ff7abf220e1ef06eb7b8e7bb9d81bb7dbd053b19d2fe72bd154d7ee6a6ac"} Apr 16 18:18:10.821937 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.821917 2567 scope.go:117] "RemoveContainer" containerID="6c549a2c62d45b104275ca12602de149ca098615f4eea99a5be74c7c913f54bb" Apr 16 18:18:10.822200 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:18:10.822181 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c549a2c62d45b104275ca12602de149ca098615f4eea99a5be74c7c913f54bb\": container with ID starting with 6c549a2c62d45b104275ca12602de149ca098615f4eea99a5be74c7c913f54bb not found: ID does not exist" containerID="6c549a2c62d45b104275ca12602de149ca098615f4eea99a5be74c7c913f54bb" Apr 16 18:18:10.822270 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.822212 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c549a2c62d45b104275ca12602de149ca098615f4eea99a5be74c7c913f54bb"} err="failed to get container status \"6c549a2c62d45b104275ca12602de149ca098615f4eea99a5be74c7c913f54bb\": rpc error: code = NotFound desc = could not find container \"6c549a2c62d45b104275ca12602de149ca098615f4eea99a5be74c7c913f54bb\": container with ID starting with 6c549a2c62d45b104275ca12602de149ca098615f4eea99a5be74c7c913f54bb not found: ID does not exist" Apr 16 18:18:10.834977 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.834951 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-wkjgt"] Apr 16 18:18:10.839563 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:10.839542 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-wkjgt"] Apr 16 18:18:11.241258 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:11.241229 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ba27da-7a32-4fe2-a37a-24b85f770b8a" path="/var/lib/kubelet/pods/81ba27da-7a32-4fe2-a37a-24b85f770b8a/volumes" Apr 16 18:18:11.816703 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:11.816671 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-m7fxb" 
event={"ID":"ad965b4d-28c7-48d3-a278-b1f64b8c8284","Type":"ContainerStarted","Data":"c18c8f71e6c9838caa314eda6296e737ef243184f20936a964b96a7fbdb7952f"} Apr 16 18:18:11.817115 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:11.816803 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-65589c6846-m7fxb" Apr 16 18:18:11.832239 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:11.832180 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-65589c6846-m7fxb" podStartSLOduration=2.414493634 podStartE2EDuration="2.832164115s" podCreationTimestamp="2026-04-16 18:18:09 +0000 UTC" firstStartedPulling="2026-04-16 18:18:10.373619973 +0000 UTC m=+451.757283421" lastFinishedPulling="2026-04-16 18:18:10.791290444 +0000 UTC m=+452.174953902" observedRunningTime="2026-04-16 18:18:11.831807665 +0000 UTC m=+453.215471136" watchObservedRunningTime="2026-04-16 18:18:11.832164115 +0000 UTC m=+453.215827587" Apr 16 18:18:42.823993 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:42.823965 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-65589c6846-m7fxb" Apr 16 18:18:43.669975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.669931 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-97659"] Apr 16 18:18:43.670287 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.670272 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81ba27da-7a32-4fe2-a37a-24b85f770b8a" containerName="manager" Apr 16 18:18:43.670287 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.670288 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ba27da-7a32-4fe2-a37a-24b85f770b8a" containerName="manager" Apr 16 18:18:43.670378 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.670363 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="81ba27da-7a32-4fe2-a37a-24b85f770b8a" containerName="manager" Apr 16 18:18:43.673357 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.673337 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-97659" Apr 16 18:18:43.675871 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.675850 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-f74w4\"" Apr 16 18:18:43.675970 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.675901 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 18:18:43.684470 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.684441 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-97659"] Apr 16 18:18:43.688221 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.688198 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-gmrrc"] Apr 16 18:18:43.690718 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.690698 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-gmrrc" Apr 16 18:18:43.693271 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.693249 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-ltshk\"" Apr 16 18:18:43.693385 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.693287 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 18:18:43.701020 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.700996 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-gmrrc"] Apr 16 18:18:43.771620 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.771587 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77bc6\" (UniqueName: \"kubernetes.io/projected/e4980a96-da9b-4070-92aa-f39af7bedab9-kube-api-access-77bc6\") pod \"model-serving-api-86f7b4b499-97659\" (UID: \"e4980a96-da9b-4070-92aa-f39af7bedab9\") " pod="kserve/model-serving-api-86f7b4b499-97659" Apr 16 18:18:43.771620 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.771625 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4980a96-da9b-4070-92aa-f39af7bedab9-tls-certs\") pod \"model-serving-api-86f7b4b499-97659\" (UID: \"e4980a96-da9b-4070-92aa-f39af7bedab9\") " pod="kserve/model-serving-api-86f7b4b499-97659" Apr 16 18:18:43.771873 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.771651 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9b27\" (UniqueName: \"kubernetes.io/projected/b6487db1-ad50-4523-844f-e7d632a41af0-kube-api-access-j9b27\") pod \"odh-model-controller-696fc77849-gmrrc\" (UID: \"b6487db1-ad50-4523-844f-e7d632a41af0\") " pod="kserve/odh-model-controller-696fc77849-gmrrc" Apr 16 18:18:43.771873 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.771758 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6487db1-ad50-4523-844f-e7d632a41af0-cert\") pod \"odh-model-controller-696fc77849-gmrrc\" (UID: \"b6487db1-ad50-4523-844f-e7d632a41af0\") " pod="kserve/odh-model-controller-696fc77849-gmrrc" Apr 16 18:18:43.872898 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.872863 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6487db1-ad50-4523-844f-e7d632a41af0-cert\") pod \"odh-model-controller-696fc77849-gmrrc\" (UID: \"b6487db1-ad50-4523-844f-e7d632a41af0\") " pod="kserve/odh-model-controller-696fc77849-gmrrc" Apr 16 18:18:43.873305 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.872915 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77bc6\" (UniqueName: \"kubernetes.io/projected/e4980a96-da9b-4070-92aa-f39af7bedab9-kube-api-access-77bc6\") pod \"model-serving-api-86f7b4b499-97659\" (UID: \"e4980a96-da9b-4070-92aa-f39af7bedab9\") " pod="kserve/model-serving-api-86f7b4b499-97659" Apr 16 18:18:43.873305 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.872936 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4980a96-da9b-4070-92aa-f39af7bedab9-tls-certs\") pod 
\"model-serving-api-86f7b4b499-97659\" (UID: \"e4980a96-da9b-4070-92aa-f39af7bedab9\") " pod="kserve/model-serving-api-86f7b4b499-97659" Apr 16 18:18:43.873305 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.872962 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9b27\" (UniqueName: \"kubernetes.io/projected/b6487db1-ad50-4523-844f-e7d632a41af0-kube-api-access-j9b27\") pod \"odh-model-controller-696fc77849-gmrrc\" (UID: \"b6487db1-ad50-4523-844f-e7d632a41af0\") " pod="kserve/odh-model-controller-696fc77849-gmrrc" Apr 16 18:18:43.875460 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.875439 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4980a96-da9b-4070-92aa-f39af7bedab9-tls-certs\") pod \"model-serving-api-86f7b4b499-97659\" (UID: \"e4980a96-da9b-4070-92aa-f39af7bedab9\") " pod="kserve/model-serving-api-86f7b4b499-97659" Apr 16 18:18:43.875566 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.875533 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6487db1-ad50-4523-844f-e7d632a41af0-cert\") pod \"odh-model-controller-696fc77849-gmrrc\" (UID: \"b6487db1-ad50-4523-844f-e7d632a41af0\") " pod="kserve/odh-model-controller-696fc77849-gmrrc" Apr 16 18:18:43.880497 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.880471 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77bc6\" (UniqueName: \"kubernetes.io/projected/e4980a96-da9b-4070-92aa-f39af7bedab9-kube-api-access-77bc6\") pod \"model-serving-api-86f7b4b499-97659\" (UID: \"e4980a96-da9b-4070-92aa-f39af7bedab9\") " pod="kserve/model-serving-api-86f7b4b499-97659" Apr 16 18:18:43.880626 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.880608 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9b27\" (UniqueName: \"kubernetes.io/projected/b6487db1-ad50-4523-844f-e7d632a41af0-kube-api-access-j9b27\") pod \"odh-model-controller-696fc77849-gmrrc\" (UID: \"b6487db1-ad50-4523-844f-e7d632a41af0\") " pod="kserve/odh-model-controller-696fc77849-gmrrc" Apr 16 18:18:43.984687 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:43.984652 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-97659" Apr 16 18:18:44.005551 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:44.005514 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-gmrrc" Apr 16 18:18:44.110316 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:44.110257 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-97659"] Apr 16 18:18:44.114285 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:18:44.114252 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4980a96_da9b_4070_92aa_f39af7bedab9.slice/crio-6436eac020322ef2b2b6757af37347197d8124b6b598262180369973dc9c0f43 WatchSource:0}: Error finding container 6436eac020322ef2b2b6757af37347197d8124b6b598262180369973dc9c0f43: Status 404 returned error can't find the container with id 6436eac020322ef2b2b6757af37347197d8124b6b598262180369973dc9c0f43 Apr 16 18:18:44.135728 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:44.135702 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-gmrrc"] Apr 16 18:18:44.138452 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:18:44.138427 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6487db1_ad50_4523_844f_e7d632a41af0.slice/crio-c52f3e6591b455340d1954453f44e387b0229713126af480b473c71c0caf24e2 WatchSource:0}: Error finding container c52f3e6591b455340d1954453f44e387b0229713126af480b473c71c0caf24e2: Status 404 returned error can't find the container with id c52f3e6591b455340d1954453f44e387b0229713126af480b473c71c0caf24e2 Apr 16 18:18:44.922996 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:44.922958 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-97659" event={"ID":"e4980a96-da9b-4070-92aa-f39af7bedab9","Type":"ContainerStarted","Data":"6436eac020322ef2b2b6757af37347197d8124b6b598262180369973dc9c0f43"} Apr 16 18:18:44.924141 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:44.924104 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-gmrrc" event={"ID":"b6487db1-ad50-4523-844f-e7d632a41af0","Type":"ContainerStarted","Data":"c52f3e6591b455340d1954453f44e387b0229713126af480b473c71c0caf24e2"} Apr 16 18:18:47.935839 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:47.935805 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-97659" event={"ID":"e4980a96-da9b-4070-92aa-f39af7bedab9","Type":"ContainerStarted","Data":"819e574d04a7638465512554ed6ae106189b4d84af0dfbc957a8cebc5b4ff17f"} Apr 16 18:18:47.936318 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:47.935868 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-97659" Apr 16 18:18:47.937296 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:47.937268 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-gmrrc" event={"ID":"b6487db1-ad50-4523-844f-e7d632a41af0","Type":"ContainerStarted","Data":"55edf363622808d285bb557380156e4311b944c8d84a511c800b1608b575afe4"} Apr 16 18:18:47.937440 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:47.937363 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-gmrrc" Apr 16 18:18:47.951778 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:47.951734 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-97659" 
podStartSLOduration=1.952009879 podStartE2EDuration="4.951721137s" podCreationTimestamp="2026-04-16 18:18:43 +0000 UTC" firstStartedPulling="2026-04-16 18:18:44.116266678 +0000 UTC m=+485.499930127" lastFinishedPulling="2026-04-16 18:18:47.115977933 +0000 UTC m=+488.499641385" observedRunningTime="2026-04-16 18:18:47.950661935 +0000 UTC m=+489.334325407" watchObservedRunningTime="2026-04-16 18:18:47.951721137 +0000 UTC m=+489.335384608" Apr 16 18:18:47.965959 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:47.965913 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-gmrrc" podStartSLOduration=1.959099331 podStartE2EDuration="4.965898018s" podCreationTimestamp="2026-04-16 18:18:43 +0000 UTC" firstStartedPulling="2026-04-16 18:18:44.139639225 +0000 UTC m=+485.523302674" lastFinishedPulling="2026-04-16 18:18:47.146437908 +0000 UTC m=+488.530101361" observedRunningTime="2026-04-16 18:18:47.964582024 +0000 UTC m=+489.348245495" watchObservedRunningTime="2026-04-16 18:18:47.965898018 +0000 UTC m=+489.349561490" Apr 16 18:18:48.492127 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.492097 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-688684b878-lqnzv"] Apr 16 18:18:48.495106 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.495083 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.507432 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.507408 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-688684b878-lqnzv"] Apr 16 18:18:48.613898 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.613865 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-trusted-ca-bundle\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.614095 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.613914 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bg8q\" (UniqueName: \"kubernetes.io/projected/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-kube-api-access-6bg8q\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.614095 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.613985 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-service-ca\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.614095 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.614018 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-console-serving-cert\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.614095 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.614065 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-console-config\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.614256 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.614100 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-oauth-serving-cert\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.614256 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.614134 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-console-oauth-config\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.714736 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.714699 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-service-ca\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.714736 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.714740 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-console-serving-cert\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.714989 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.714766 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-console-config\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.714989 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.714790 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-oauth-serving-cert\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.714989 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.714981 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-console-oauth-config\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.715190 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.715038 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-trusted-ca-bundle\") pod 
\"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.715190 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.715121 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bg8q\" (UniqueName: \"kubernetes.io/projected/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-kube-api-access-6bg8q\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.715597 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.715562 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-service-ca\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.715722 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.715665 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-console-config\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.715722 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.715684 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-oauth-serving-cert\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.716084 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.716032 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-trusted-ca-bundle\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.717398 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.717379 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-console-serving-cert\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.717533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.717512 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-console-oauth-config\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.723671 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.723642 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bg8q\" (UniqueName: \"kubernetes.io/projected/41f0d5a2-a14e-43d3-92d4-f5a2eac53408-kube-api-access-6bg8q\") pod \"console-688684b878-lqnzv\" (UID: \"41f0d5a2-a14e-43d3-92d4-f5a2eac53408\") " pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.804693 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.804615 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:48.926474 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.926450 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-688684b878-lqnzv"] Apr 16 18:18:48.929126 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:18:48.929101 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41f0d5a2_a14e_43d3_92d4_f5a2eac53408.slice/crio-eebc5a869ba82e9ed98d25e0f734968dc6450f1a6a7cb91ecdee4ec4091fd34f WatchSource:0}: Error finding container eebc5a869ba82e9ed98d25e0f734968dc6450f1a6a7cb91ecdee4ec4091fd34f: Status 404 returned error can't find the container with id eebc5a869ba82e9ed98d25e0f734968dc6450f1a6a7cb91ecdee4ec4091fd34f Apr 16 18:18:48.940836 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:48.940797 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-688684b878-lqnzv" event={"ID":"41f0d5a2-a14e-43d3-92d4-f5a2eac53408","Type":"ContainerStarted","Data":"eebc5a869ba82e9ed98d25e0f734968dc6450f1a6a7cb91ecdee4ec4091fd34f"} Apr 16 18:18:49.945551 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:49.945517 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-688684b878-lqnzv" event={"ID":"41f0d5a2-a14e-43d3-92d4-f5a2eac53408","Type":"ContainerStarted","Data":"99ecd071f478359c76ea154cb7572ddacb4f1a98b1cae16cdefe0c4beb131cea"} Apr 16 18:18:49.964629 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:49.964582 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-688684b878-lqnzv" podStartSLOduration=1.964566959 podStartE2EDuration="1.964566959s" podCreationTimestamp="2026-04-16 18:18:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:18:49.962719627 +0000 UTC m=+491.346383098" watchObservedRunningTime="2026-04-16 18:18:49.964566959 +0000 UTC m=+491.348230430" Apr 16 18:18:58.805344 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:58.805304 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:58.805344 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:58.805349 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:58.810141 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:58.810116 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:58.943696 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:58.943667 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-gmrrc" Apr 16 18:18:58.945618 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:58.945599 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-97659" Apr 16 18:18:58.986396 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:58.986367 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-688684b878-lqnzv" Apr 16 18:18:59.037322 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:18:59.037288 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54c68c4599-p82pw"] Apr 16 18:19:18.036138 ip-10-0-141-192 
kubenswrapper[2567]: I0416 18:19:18.036101 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t"] Apr 16 18:19:18.096845 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.096810 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t"] Apr 16 18:19:18.096999 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.096925 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t" Apr 16 18:19:18.099700 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.099674 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 18:19:18.099700 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.099689 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 16 18:19:18.168258 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.168230 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/bb096826-14ff-440f-837d-056444717b80-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcb2t\" (UID: \"bb096826-14ff-440f-837d-056444717b80\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t" Apr 16 18:19:18.168379 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.168277 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/bb096826-14ff-440f-837d-056444717b80-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcb2t\" (UID: \"bb096826-14ff-440f-837d-056444717b80\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t" Apr 16 18:19:18.168379 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.168305 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgw8z\" (UniqueName: \"kubernetes.io/projected/bb096826-14ff-440f-837d-056444717b80-kube-api-access-bgw8z\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcb2t\" (UID: \"bb096826-14ff-440f-837d-056444717b80\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t" Apr 16 18:19:18.269423 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.269395 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/bb096826-14ff-440f-837d-056444717b80-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcb2t\" (UID: \"bb096826-14ff-440f-837d-056444717b80\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t" Apr 16 18:19:18.269627 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.269436 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgw8z\" (UniqueName: \"kubernetes.io/projected/bb096826-14ff-440f-837d-056444717b80-kube-api-access-bgw8z\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcb2t\" (UID: \"bb096826-14ff-440f-837d-056444717b80\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t" Apr 16 18:19:18.269627 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.269490 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/bb096826-14ff-440f-837d-056444717b80-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcb2t\" (UID: \"bb096826-14ff-440f-837d-056444717b80\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t" 
Apr 16 18:19:18.269839 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.269823 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/bb096826-14ff-440f-837d-056444717b80-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcb2t\" (UID: \"bb096826-14ff-440f-837d-056444717b80\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t"
Apr 16 18:19:18.271808 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.271788 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/bb096826-14ff-440f-837d-056444717b80-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcb2t\" (UID: \"bb096826-14ff-440f-837d-056444717b80\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t"
Apr 16 18:19:18.277666 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.277644 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgw8z\" (UniqueName: \"kubernetes.io/projected/bb096826-14ff-440f-837d-056444717b80-kube-api-access-bgw8z\") pod \"seaweedfs-tls-custom-5c88b85bb7-wcb2t\" (UID: \"bb096826-14ff-440f-837d-056444717b80\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t"
Apr 16 18:19:18.406762 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.406696 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t"
Apr 16 18:19:18.522641 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:18.522599 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t"]
Apr 16 18:19:18.525410 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:19:18.525369 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb096826_14ff_440f_837d_056444717b80.slice/crio-a50b68b7b57114828d56254a8d0938dfb9676cff470a6ffbd70cfced020169ca WatchSource:0}: Error finding container a50b68b7b57114828d56254a8d0938dfb9676cff470a6ffbd70cfced020169ca: Status 404 returned error can't find the container with id a50b68b7b57114828d56254a8d0938dfb9676cff470a6ffbd70cfced020169ca
Apr 16 18:19:19.056676 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:19.056634 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t" event={"ID":"bb096826-14ff-440f-837d-056444717b80","Type":"ContainerStarted","Data":"e295d498d9fac41b6ccc643132a5246b057251482084002d4a63733b80b9adb2"}
Apr 16 18:19:19.056676 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:19.056681 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t" event={"ID":"bb096826-14ff-440f-837d-056444717b80","Type":"ContainerStarted","Data":"a50b68b7b57114828d56254a8d0938dfb9676cff470a6ffbd70cfced020169ca"}
Apr 16 18:19:19.072969 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:19.072867 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-wcb2t" podStartSLOduration=0.783922891 podStartE2EDuration="1.072847595s" podCreationTimestamp="2026-04-16 18:19:18 +0000 UTC" firstStartedPulling="2026-04-16 18:19:18.526679311 +0000 UTC m=+519.910342760" lastFinishedPulling="2026-04-16 18:19:18.815604014 +0000 UTC m=+520.199267464" observedRunningTime="2026-04-16 18:19:19.072303991 +0000 UTC m=+520.455967462" watchObservedRunningTime="2026-04-16 18:19:19.072847595 +0000 UTC m=+520.456511066"
Apr 16 18:19:24.060422 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.060376 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-54c68c4599-p82pw" podUID="fe49d306-342c-46cb-a161-7299711e30fc" containerName="console" containerID="cri-o://1aa591355d8dac80ec9ec8cd2e1d5b3597ed90b7a9b50c9e3697ed0787e4e5e7" gracePeriod=15
Apr 16 18:19:24.296777 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.296753 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54c68c4599-p82pw_fe49d306-342c-46cb-a161-7299711e30fc/console/0.log"
Apr 16 18:19:24.296909 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.296815 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54c68c4599-p82pw"
Apr 16 18:19:24.325807 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.325732 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-service-ca\") pod \"fe49d306-342c-46cb-a161-7299711e30fc\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") "
Apr 16 18:19:24.325807 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.325780 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe49d306-342c-46cb-a161-7299711e30fc-console-serving-cert\") pod \"fe49d306-342c-46cb-a161-7299711e30fc\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") "
Apr 16 18:19:24.325998 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.325807 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-trusted-ca-bundle\") pod \"fe49d306-342c-46cb-a161-7299711e30fc\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") "
Apr 16 18:19:24.325998 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.325882 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cpt5\" (UniqueName: \"kubernetes.io/projected/fe49d306-342c-46cb-a161-7299711e30fc-kube-api-access-5cpt5\") pod \"fe49d306-342c-46cb-a161-7299711e30fc\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") "
Apr 16 18:19:24.325998 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.325917 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-oauth-serving-cert\") pod \"fe49d306-342c-46cb-a161-7299711e30fc\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") "
Apr 16 18:19:24.325998 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.325961 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe49d306-342c-46cb-a161-7299711e30fc-console-oauth-config\") pod \"fe49d306-342c-46cb-a161-7299711e30fc\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") "
Apr 16 18:19:24.326222 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.326004 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-console-config\") pod \"fe49d306-342c-46cb-a161-7299711e30fc\" (UID: \"fe49d306-342c-46cb-a161-7299711e30fc\") "
Apr 16 18:19:24.326222 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.326111 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-service-ca" (OuterVolumeSpecName: "service-ca") pod "fe49d306-342c-46cb-a161-7299711e30fc" (UID: "fe49d306-342c-46cb-a161-7299711e30fc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:24.326323 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.326280 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fe49d306-342c-46cb-a161-7299711e30fc" (UID: "fe49d306-342c-46cb-a161-7299711e30fc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:24.326373 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.326345 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-service-ca\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:19:24.326373 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.326358 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fe49d306-342c-46cb-a161-7299711e30fc" (UID: "fe49d306-342c-46cb-a161-7299711e30fc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:24.326484 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.326432 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-console-config" (OuterVolumeSpecName: "console-config") pod "fe49d306-342c-46cb-a161-7299711e30fc" (UID: "fe49d306-342c-46cb-a161-7299711e30fc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:24.328132 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.328103 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe49d306-342c-46cb-a161-7299711e30fc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fe49d306-342c-46cb-a161-7299711e30fc" (UID: "fe49d306-342c-46cb-a161-7299711e30fc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:19:24.328234 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.328137 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe49d306-342c-46cb-a161-7299711e30fc-kube-api-access-5cpt5" (OuterVolumeSpecName: "kube-api-access-5cpt5") pod "fe49d306-342c-46cb-a161-7299711e30fc" (UID: "fe49d306-342c-46cb-a161-7299711e30fc"). InnerVolumeSpecName "kube-api-access-5cpt5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:19:24.328234 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.328157 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe49d306-342c-46cb-a161-7299711e30fc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fe49d306-342c-46cb-a161-7299711e30fc" (UID: "fe49d306-342c-46cb-a161-7299711e30fc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:19:24.427775 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.427729 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5cpt5\" (UniqueName: \"kubernetes.io/projected/fe49d306-342c-46cb-a161-7299711e30fc-kube-api-access-5cpt5\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:19:24.427775 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.427772 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-oauth-serving-cert\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:19:24.427775 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.427781 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe49d306-342c-46cb-a161-7299711e30fc-console-oauth-config\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:19:24.427775 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.427793 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-console-config\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:19:24.428036 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.427802 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe49d306-342c-46cb-a161-7299711e30fc-console-serving-cert\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:19:24.428036 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:24.427811 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe49d306-342c-46cb-a161-7299711e30fc-trusted-ca-bundle\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:19:25.083504 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:25.083472 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54c68c4599-p82pw_fe49d306-342c-46cb-a161-7299711e30fc/console/0.log"
Apr 16 18:19:25.083959 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:25.083513 2567 generic.go:358] "Generic (PLEG): container finished" podID="fe49d306-342c-46cb-a161-7299711e30fc" containerID="1aa591355d8dac80ec9ec8cd2e1d5b3597ed90b7a9b50c9e3697ed0787e4e5e7" exitCode=2
Apr 16 18:19:25.083959 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:25.083568 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54c68c4599-p82pw" event={"ID":"fe49d306-342c-46cb-a161-7299711e30fc","Type":"ContainerDied","Data":"1aa591355d8dac80ec9ec8cd2e1d5b3597ed90b7a9b50c9e3697ed0787e4e5e7"}
Apr 16 18:19:25.083959 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:25.083593 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54c68c4599-p82pw" event={"ID":"fe49d306-342c-46cb-a161-7299711e30fc","Type":"ContainerDied","Data":"4f0ab478439e75dd0faf9332885363fd5989fc9ea207da6a067ef1f7bc9be116"}
Apr 16 18:19:25.083959 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:25.083596 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54c68c4599-p82pw"
Apr 16 18:19:25.083959 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:25.083610 2567 scope.go:117] "RemoveContainer" containerID="1aa591355d8dac80ec9ec8cd2e1d5b3597ed90b7a9b50c9e3697ed0787e4e5e7"
Apr 16 18:19:25.091946 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:25.091928 2567 scope.go:117] "RemoveContainer" containerID="1aa591355d8dac80ec9ec8cd2e1d5b3597ed90b7a9b50c9e3697ed0787e4e5e7"
Apr 16 18:19:25.092222 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:19:25.092198 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aa591355d8dac80ec9ec8cd2e1d5b3597ed90b7a9b50c9e3697ed0787e4e5e7\": container with ID starting with 1aa591355d8dac80ec9ec8cd2e1d5b3597ed90b7a9b50c9e3697ed0787e4e5e7 not found: ID does not exist" containerID="1aa591355d8dac80ec9ec8cd2e1d5b3597ed90b7a9b50c9e3697ed0787e4e5e7"
Apr 16 18:19:25.092316 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:25.092229 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aa591355d8dac80ec9ec8cd2e1d5b3597ed90b7a9b50c9e3697ed0787e4e5e7"} err="failed to get container status \"1aa591355d8dac80ec9ec8cd2e1d5b3597ed90b7a9b50c9e3697ed0787e4e5e7\": rpc error: code = NotFound desc = could not find container \"1aa591355d8dac80ec9ec8cd2e1d5b3597ed90b7a9b50c9e3697ed0787e4e5e7\": container with ID starting with 1aa591355d8dac80ec9ec8cd2e1d5b3597ed90b7a9b50c9e3697ed0787e4e5e7 not found: ID does not exist"
Apr 16 18:19:25.105166 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:25.105133 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54c68c4599-p82pw"]
Apr 16 18:19:25.108140 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:25.108119 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54c68c4599-p82pw"]
Apr 16 18:19:25.241282 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:25.241250 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe49d306-342c-46cb-a161-7299711e30fc" path="/var/lib/kubelet/pods/fe49d306-342c-46cb-a161-7299711e30fc/volumes"
Apr 16 18:19:47.170450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:47.170409 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"]
Apr 16 18:19:47.170912 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:47.170888 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe49d306-342c-46cb-a161-7299711e30fc" containerName="console"
Apr 16 18:19:47.170912 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:47.170906 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe49d306-342c-46cb-a161-7299711e30fc" containerName="console"
Apr 16 18:19:47.171024 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:47.171001 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe49d306-342c-46cb-a161-7299711e30fc" containerName="console"
Apr 16 18:19:47.174553 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:47.174531 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"
Apr 16 18:19:47.177031 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:47.177012 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ww52g\""
Apr 16 18:19:47.182340 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:47.182313 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"]
Apr 16 18:19:47.320304 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:47.320269 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48a00c78-e08f-487c-8592-bb5c16e55ddb-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4\" (UID: \"48a00c78-e08f-487c-8592-bb5c16e55ddb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"
Apr 16 18:19:47.421253 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:47.421166 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48a00c78-e08f-487c-8592-bb5c16e55ddb-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4\" (UID: \"48a00c78-e08f-487c-8592-bb5c16e55ddb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"
Apr 16 18:19:47.421600 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:47.421576 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48a00c78-e08f-487c-8592-bb5c16e55ddb-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4\" (UID: \"48a00c78-e08f-487c-8592-bb5c16e55ddb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"
Apr 16 18:19:47.486309 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:47.486274 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"
Apr 16 18:19:47.611102 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:47.611072 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"]
Apr 16 18:19:47.613890 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:19:47.613864 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48a00c78_e08f_487c_8592_bb5c16e55ddb.slice/crio-dae66817bc5fb641cd8c66a01a1844de0db78ee94fff227d5fadec824ecf5405 WatchSource:0}: Error finding container dae66817bc5fb641cd8c66a01a1844de0db78ee94fff227d5fadec824ecf5405: Status 404 returned error can't find the container with id dae66817bc5fb641cd8c66a01a1844de0db78ee94fff227d5fadec824ecf5405
Apr 16 18:19:48.162368 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:48.162336 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" event={"ID":"48a00c78-e08f-487c-8592-bb5c16e55ddb","Type":"ContainerStarted","Data":"dae66817bc5fb641cd8c66a01a1844de0db78ee94fff227d5fadec824ecf5405"}
Apr 16 18:19:52.176399 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:52.176360 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" event={"ID":"48a00c78-e08f-487c-8592-bb5c16e55ddb","Type":"ContainerStarted","Data":"ea44c1d75e5aa9c01a3b7d6f58fe74629877a95f9462a98da2f9c0df501ecd32"}
Apr 16 18:19:55.187003 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:55.186966 2567 generic.go:358] "Generic (PLEG): container finished" podID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerID="ea44c1d75e5aa9c01a3b7d6f58fe74629877a95f9462a98da2f9c0df501ecd32" exitCode=0
Apr 16 18:19:55.187465 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:19:55.187052 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" event={"ID":"48a00c78-e08f-487c-8592-bb5c16e55ddb","Type":"ContainerDied","Data":"ea44c1d75e5aa9c01a3b7d6f58fe74629877a95f9462a98da2f9c0df501ecd32"}
Apr 16 18:20:08.242831 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:08.242800 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" event={"ID":"48a00c78-e08f-487c-8592-bb5c16e55ddb","Type":"ContainerStarted","Data":"2b900dcf8fe93302fd2d850e7f350b759a32fd464b066733d40c58ad3785ec6b"}
Apr 16 18:20:12.257544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:12.257506 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" event={"ID":"48a00c78-e08f-487c-8592-bb5c16e55ddb","Type":"ContainerStarted","Data":"297091bf53384bec052b046573e87033f4cfe15541caed3537fde09d1ecad15a"}
Apr 16 18:20:12.257967 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:12.257716 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"
Apr 16 18:20:12.259094 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:12.259070 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 16 18:20:12.277099 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:12.277028 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podStartSLOduration=1.47334572 podStartE2EDuration="25.277012202s" podCreationTimestamp="2026-04-16 18:19:47 +0000 UTC" firstStartedPulling="2026-04-16 18:19:47.615689086 +0000 UTC m=+548.999352535" lastFinishedPulling="2026-04-16 18:20:11.419355564 +0000 UTC m=+572.803019017" observedRunningTime="2026-04-16 18:20:12.2757132 +0000 UTC m=+573.659376668" watchObservedRunningTime="2026-04-16 18:20:12.277012202 +0000 UTC m=+573.660675673"
Apr 16 18:20:13.260475 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:13.260434 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"
Apr 16 18:20:13.260917 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:13.260499 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 16 18:20:13.261416 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:13.261392 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:20:14.264326 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:14.264285 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 16 18:20:14.264780 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:14.264726 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:20:24.265213 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:24.265159 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 16 18:20:24.265666 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:24.265640 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:20:34.264952 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:34.264899 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 16 18:20:34.265452 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:34.265264 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:20:44.265271 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:44.265221 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 16 18:20:44.265682 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:44.265656 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:20:54.265025 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:54.264969 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 16 18:20:54.265498 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:20:54.265471 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:21:04.264516 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:04.264462 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 16 18:21:04.266841 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:04.264912 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:21:14.265201 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:14.265166 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"
Apr 16 18:21:14.265738 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:14.265615 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"
Apr 16 18:21:22.332902 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:22.332863 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"]
Apr 16 18:21:22.333401 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:22.333352 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container" containerID="cri-o://2b900dcf8fe93302fd2d850e7f350b759a32fd464b066733d40c58ad3785ec6b" gracePeriod=30
Apr 16 18:21:22.333532 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:22.333510 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="agent" containerID="cri-o://297091bf53384bec052b046573e87033f4cfe15541caed3537fde09d1ecad15a" gracePeriod=30
Apr 16 18:21:22.441318 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:22.441282 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"]
Apr 16 18:21:22.445136 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:22.445115 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"
Apr 16 18:21:22.455855 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:22.455820 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"]
Apr 16 18:21:22.525799 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:22.525754 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ac956c4-200a-4c7d-8eba-cbac818ac3d3-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q\" (UID: \"7ac956c4-200a-4c7d-8eba-cbac818ac3d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"
Apr 16 18:21:22.627260 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:22.627175 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ac956c4-200a-4c7d-8eba-cbac818ac3d3-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q\" (UID: \"7ac956c4-200a-4c7d-8eba-cbac818ac3d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"
Apr 16 18:21:22.627562 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:22.627540 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ac956c4-200a-4c7d-8eba-cbac818ac3d3-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q\" (UID: \"7ac956c4-200a-4c7d-8eba-cbac818ac3d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"
Apr 16 18:21:22.758080 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:22.758033 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"
Apr 16 18:21:22.882583 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:22.882555 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"]
Apr 16 18:21:22.885288 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:21:22.885259 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ac956c4_200a_4c7d_8eba_cbac818ac3d3.slice/crio-28c9d2124ff9f0d41352fe7eb2b18a0431cdd565c2fad142c59983a393dca8ec WatchSource:0}: Error finding container 28c9d2124ff9f0d41352fe7eb2b18a0431cdd565c2fad142c59983a393dca8ec: Status 404 returned error can't find the container with id 28c9d2124ff9f0d41352fe7eb2b18a0431cdd565c2fad142c59983a393dca8ec
Apr 16 18:21:22.887390 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:22.887367 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:21:23.488740 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:23.488705 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" event={"ID":"7ac956c4-200a-4c7d-8eba-cbac818ac3d3","Type":"ContainerStarted","Data":"28dc9e9dc817ef78abebc0af7c516e99e8c4a68c6187b4a26b4a377410cc7bed"}
Apr 16 18:21:23.488740 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:23.488742 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" event={"ID":"7ac956c4-200a-4c7d-8eba-cbac818ac3d3","Type":"ContainerStarted","Data":"28c9d2124ff9f0d41352fe7eb2b18a0431cdd565c2fad142c59983a393dca8ec"}
Apr 16 18:21:24.265616 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:24.265574 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:21:24.265847 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:24.265710 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 16 18:21:27.503153 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:27.503120 2567 generic.go:358] "Generic (PLEG): container finished" podID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerID="2b900dcf8fe93302fd2d850e7f350b759a32fd464b066733d40c58ad3785ec6b" exitCode=0
Apr 16 18:21:27.503555 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:27.503197 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" event={"ID":"48a00c78-e08f-487c-8592-bb5c16e55ddb","Type":"ContainerDied","Data":"2b900dcf8fe93302fd2d850e7f350b759a32fd464b066733d40c58ad3785ec6b"}
Apr 16 18:21:27.504523 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:27.504504 2567 generic.go:358] "Generic (PLEG): container finished" podID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerID="28dc9e9dc817ef78abebc0af7c516e99e8c4a68c6187b4a26b4a377410cc7bed" exitCode=0
Apr 16 18:21:27.504630 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:27.504550 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" event={"ID":"7ac956c4-200a-4c7d-8eba-cbac818ac3d3","Type":"ContainerDied","Data":"28dc9e9dc817ef78abebc0af7c516e99e8c4a68c6187b4a26b4a377410cc7bed"}
Apr 16 18:21:28.510215 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:28.510181 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" event={"ID":"7ac956c4-200a-4c7d-8eba-cbac818ac3d3","Type":"ContainerStarted","Data":"948997b05c6d2db436f128277d13b924de6965450910f61d19e631ff26d55412"}
Apr 16 18:21:28.510215 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:28.510221 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" event={"ID":"7ac956c4-200a-4c7d-8eba-cbac818ac3d3","Type":"ContainerStarted","Data":"c55e9e39f387eaade71bd118b02b0de126ba069e1e87f70e63ab0ca2decd5511"}
Apr 16 18:21:28.510673 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:28.510532 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"
Apr 16 18:21:28.510673 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:28.510562 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"
Apr 16 18:21:28.512200 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:28.512172 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:5000: connect: connection refused"
Apr 16 18:21:28.512911 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:28.512887 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:21:28.529173 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:28.529123 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podStartSLOduration=6.529107384 podStartE2EDuration="6.529107384s" podCreationTimestamp="2026-04-16 18:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:21:28.52762439 +0000 UTC m=+649.911287860" watchObservedRunningTime="2026-04-16 18:21:28.529107384 +0000 UTC m=+649.912770855"
Apr 16 18:21:29.514338 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:29.514295 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:5000: connect: connection refused"
Apr 16 18:21:29.514785 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:29.514730 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:21:34.264679 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:34.264625 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:21:34.265865 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:34.265836 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 16 18:21:39.514569 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:39.514514 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:5000: connect: connection refused"
Apr 16 18:21:39.515020 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:39.514993 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:21:44.264697 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:44.264656 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:21:44.265106 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:44.264776 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"
Apr 16 18:21:44.265891 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:44.265862 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 16 18:21:44.265997 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:44.265954 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"
Apr 16 18:21:49.514364 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:49.514313 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:5000: connect: connection refused"
Apr 16 18:21:49.514863 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:49.514686 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:21:52.482641 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.482615 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"
Apr 16 18:21:52.586132 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.586103 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48a00c78-e08f-487c-8592-bb5c16e55ddb-kserve-provision-location\") pod \"48a00c78-e08f-487c-8592-bb5c16e55ddb\" (UID: \"48a00c78-e08f-487c-8592-bb5c16e55ddb\") "
Apr 16 18:21:52.586435 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.586409 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48a00c78-e08f-487c-8592-bb5c16e55ddb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "48a00c78-e08f-487c-8592-bb5c16e55ddb" (UID: "48a00c78-e08f-487c-8592-bb5c16e55ddb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:21:52.594051 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.593961 2567 generic.go:358] "Generic (PLEG): container finished" podID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerID="297091bf53384bec052b046573e87033f4cfe15541caed3537fde09d1ecad15a" exitCode=0
Apr 16 18:21:52.594051 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.594008 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" event={"ID":"48a00c78-e08f-487c-8592-bb5c16e55ddb","Type":"ContainerDied","Data":"297091bf53384bec052b046573e87033f4cfe15541caed3537fde09d1ecad15a"}
Apr 16 18:21:52.594051 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.594030 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4" event={"ID":"48a00c78-e08f-487c-8592-bb5c16e55ddb","Type":"ContainerDied","Data":"dae66817bc5fb641cd8c66a01a1844de0db78ee94fff227d5fadec824ecf5405"}
Apr 16 18:21:52.594263 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.594073 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"
Apr 16 18:21:52.594263 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.594079 2567 scope.go:117] "RemoveContainer" containerID="297091bf53384bec052b046573e87033f4cfe15541caed3537fde09d1ecad15a"
Apr 16 18:21:52.602289 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.602273 2567 scope.go:117] "RemoveContainer" containerID="2b900dcf8fe93302fd2d850e7f350b759a32fd464b066733d40c58ad3785ec6b"
Apr 16 18:21:52.609175 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.609159 2567 scope.go:117] "RemoveContainer" containerID="ea44c1d75e5aa9c01a3b7d6f58fe74629877a95f9462a98da2f9c0df501ecd32"
Apr 16 18:21:52.614836 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.614817 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"]
Apr 16 18:21:52.617429 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.617411 2567 scope.go:117] "RemoveContainer" containerID="297091bf53384bec052b046573e87033f4cfe15541caed3537fde09d1ecad15a"
Apr 16 18:21:52.617721 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:21:52.617702 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"297091bf53384bec052b046573e87033f4cfe15541caed3537fde09d1ecad15a\": container with ID starting with 297091bf53384bec052b046573e87033f4cfe15541caed3537fde09d1ecad15a not found: ID does not exist" containerID="297091bf53384bec052b046573e87033f4cfe15541caed3537fde09d1ecad15a"
Apr 16 18:21:52.617787 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.617729 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"297091bf53384bec052b046573e87033f4cfe15541caed3537fde09d1ecad15a"} err="failed to get container status \"297091bf53384bec052b046573e87033f4cfe15541caed3537fde09d1ecad15a\": rpc error: code = NotFound desc = could not find container \"297091bf53384bec052b046573e87033f4cfe15541caed3537fde09d1ecad15a\": container with ID starting with 297091bf53384bec052b046573e87033f4cfe15541caed3537fde09d1ecad15a not found: ID does not exist"
Apr 16 18:21:52.617787 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.617747 2567 scope.go:117] "RemoveContainer" containerID="2b900dcf8fe93302fd2d850e7f350b759a32fd464b066733d40c58ad3785ec6b"
Apr 16 18:21:52.617987 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:21:52.617969 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b900dcf8fe93302fd2d850e7f350b759a32fd464b066733d40c58ad3785ec6b\": container with ID starting with 2b900dcf8fe93302fd2d850e7f350b759a32fd464b066733d40c58ad3785ec6b not found: ID does not exist" containerID="2b900dcf8fe93302fd2d850e7f350b759a32fd464b066733d40c58ad3785ec6b"
Apr 16 18:21:52.618073 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.617998 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b900dcf8fe93302fd2d850e7f350b759a32fd464b066733d40c58ad3785ec6b"} err="failed to get container status \"2b900dcf8fe93302fd2d850e7f350b759a32fd464b066733d40c58ad3785ec6b\": rpc error: code = NotFound desc = could not find container \"2b900dcf8fe93302fd2d850e7f350b759a32fd464b066733d40c58ad3785ec6b\": container with ID starting with 2b900dcf8fe93302fd2d850e7f350b759a32fd464b066733d40c58ad3785ec6b not found: ID does not exist"
Apr 16 18:21:52.618073 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.618021 2567 scope.go:117] "RemoveContainer" containerID="ea44c1d75e5aa9c01a3b7d6f58fe74629877a95f9462a98da2f9c0df501ecd32"
Apr 16 18:21:52.618297 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:21:52.618276 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea44c1d75e5aa9c01a3b7d6f58fe74629877a95f9462a98da2f9c0df501ecd32\": container with ID starting with ea44c1d75e5aa9c01a3b7d6f58fe74629877a95f9462a98da2f9c0df501ecd32 not found: ID does not exist" containerID="ea44c1d75e5aa9c01a3b7d6f58fe74629877a95f9462a98da2f9c0df501ecd32"
Apr 16 18:21:52.618384 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.618300 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea44c1d75e5aa9c01a3b7d6f58fe74629877a95f9462a98da2f9c0df501ecd32"} err="failed to get container status \"ea44c1d75e5aa9c01a3b7d6f58fe74629877a95f9462a98da2f9c0df501ecd32\": rpc error: code = NotFound desc = could not find container \"ea44c1d75e5aa9c01a3b7d6f58fe74629877a95f9462a98da2f9c0df501ecd32\": container with ID starting with ea44c1d75e5aa9c01a3b7d6f58fe74629877a95f9462a98da2f9c0df501ecd32 not found: ID does not exist"
Apr 16 18:21:52.618432 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.618384 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5f9fc5f45c-mdlg4"]
Apr 16 18:21:52.687275 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:52.687243 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48a00c78-e08f-487c-8592-bb5c16e55ddb-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:21:53.241872 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:53.241830 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" path="/var/lib/kubelet/pods/48a00c78-e08f-487c-8592-bb5c16e55ddb/volumes"
Apr 16 18:21:59.514482 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:59.514432 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:5000: connect: connection refused"
Apr 16 18:21:59.514941 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:21:59.514863 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:22:09.514973 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:09.514931 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:5000: connect: connection refused"
Apr 16 18:22:09.515453 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:09.515429 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:22:19.515121 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:19.515071 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:5000: connect: connection refused"
Apr 16 18:22:19.515514 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:19.515455 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:22:29.514376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:29.514280 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:5000: connect: connection refused"
Apr 16 18:22:29.514834 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:29.514657 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:22:30.238175 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:30.238138 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:5000: connect: connection refused"
Apr 16 18:22:30.238452 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:30.238423 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:22:40.239285 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:40.239255 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"
Apr 16 18:22:40.239715 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:40.239531 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"
Apr 16 18:22:47.520389 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:47.520338 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"]
Apr 16 18:22:47.520792 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:47.520741 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container" containerID="cri-o://c55e9e39f387eaade71bd118b02b0de126ba069e1e87f70e63ab0ca2decd5511" gracePeriod=30
Apr 16 18:22:47.521155 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:47.521126 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent" containerID="cri-o://948997b05c6d2db436f128277d13b924de6965450910f61d19e631ff26d55412" gracePeriod=30
Apr 16 18:22:50.238291 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:50.238244 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:5000: connect: connection refused"
Apr 16 18:22:50.238719 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:50.238605 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:22:51.794967 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:51.794882 2567 generic.go:358] "Generic (PLEG): container finished" podID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerID="c55e9e39f387eaade71bd118b02b0de126ba069e1e87f70e63ab0ca2decd5511" exitCode=0
Apr 16 18:22:51.794967 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:51.794955 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" event={"ID":"7ac956c4-200a-4c7d-8eba-cbac818ac3d3","Type":"ContainerDied","Data":"c55e9e39f387eaade71bd118b02b0de126ba069e1e87f70e63ab0ca2decd5511"}
Apr 16 18:22:57.617253 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.617219 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"]
Apr 16 18:22:57.617698 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.617621 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="agent"
Apr 16 18:22:57.617698 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.617634 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="agent"
Apr 16 18:22:57.617698 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.617647 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container"
Apr 16 18:22:57.617698 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.617653 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container"
Apr 16 18:22:57.617698 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.617661 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="storage-initializer"
Apr 16 18:22:57.617698 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.617667 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="storage-initializer"
Apr 16 18:22:57.617999 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.617745 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="kserve-container"
Apr 16 18:22:57.617999 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.617755 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="48a00c78-e08f-487c-8592-bb5c16e55ddb" containerName="agent"
Apr 16 18:22:57.620238 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.620220 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"
Apr 16 18:22:57.628299 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.628273 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"]
Apr 16 18:22:57.638344 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.638320 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d7e1af1-3db3-46f2-8f27-5040ff1c185e-kserve-provision-location\") pod \"isvc-logger-predictor-78cf4b7dc8-pwgj9\" (UID: \"0d7e1af1-3db3-46f2-8f27-5040ff1c185e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"
Apr 16 18:22:57.739494 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.739449 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d7e1af1-3db3-46f2-8f27-5040ff1c185e-kserve-provision-location\") pod \"isvc-logger-predictor-78cf4b7dc8-pwgj9\" (UID: \"0d7e1af1-3db3-46f2-8f27-5040ff1c185e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"
Apr 16 18:22:57.739821 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.739800 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d7e1af1-3db3-46f2-8f27-5040ff1c185e-kserve-provision-location\") pod \"isvc-logger-predictor-78cf4b7dc8-pwgj9\" (UID: \"0d7e1af1-3db3-46f2-8f27-5040ff1c185e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"
Apr 16 18:22:57.932278 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:57.932183 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"
Apr 16 18:22:58.052610 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:58.051934 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"]
Apr 16 18:22:58.059469 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:22:58.059442 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d7e1af1_3db3_46f2_8f27_5040ff1c185e.slice/crio-d1f142cf98c84b5422f3d309b448ceddef4ebe5a7240a3f1d4c0ab3d2576ebfe WatchSource:0}: Error finding container d1f142cf98c84b5422f3d309b448ceddef4ebe5a7240a3f1d4c0ab3d2576ebfe: Status 404 returned error can't find the container with id d1f142cf98c84b5422f3d309b448ceddef4ebe5a7240a3f1d4c0ab3d2576ebfe
Apr 16 18:22:58.819115 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:58.819080 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" event={"ID":"0d7e1af1-3db3-46f2-8f27-5040ff1c185e","Type":"ContainerStarted","Data":"6e25882010f29f5a465cf7303aa7c5c27af87f55dcf7eec0bc538e4c5080545e"}
Apr 16 18:22:58.819115 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:22:58.819122 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" event={"ID":"0d7e1af1-3db3-46f2-8f27-5040ff1c185e","Type":"ContainerStarted","Data":"d1f142cf98c84b5422f3d309b448ceddef4ebe5a7240a3f1d4c0ab3d2576ebfe"}
Apr 16 18:23:00.238200 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:00.238157 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:5000: connect: connection refused"
Apr 16 18:23:00.238693 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:00.238410 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:23:01.830920 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:01.830887 2567 generic.go:358] "Generic (PLEG): container finished" podID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerID="6e25882010f29f5a465cf7303aa7c5c27af87f55dcf7eec0bc538e4c5080545e" exitCode=0
Apr 16 18:23:01.831360 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:01.830954 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" event={"ID":"0d7e1af1-3db3-46f2-8f27-5040ff1c185e","Type":"ContainerDied","Data":"6e25882010f29f5a465cf7303aa7c5c27af87f55dcf7eec0bc538e4c5080545e"}
Apr 16 18:23:02.836535 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:02.836503 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" event={"ID":"0d7e1af1-3db3-46f2-8f27-5040ff1c185e","Type":"ContainerStarted","Data":"14006467b27d80da1ca9ae3d8639515e7793c06ed1787389f87960947c87a9d4"}
Apr 16 18:23:02.836965 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:02.836542 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" event={"ID":"0d7e1af1-3db3-46f2-8f27-5040ff1c185e","Type":"ContainerStarted","Data":"7ca1af3b60890585dceadd87c757760129ff542aad362e04ae72cf8b4e4ed64e"}
Apr 16 18:23:02.836965 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:02.836834 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"
Apr 16 18:23:02.836965 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:02.836865 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"
Apr 16 18:23:02.838362 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:02.838326 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 16 18:23:02.838994 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:02.838970 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:23:02.853891 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:02.853844 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podStartSLOduration=5.853828307 podStartE2EDuration="5.853828307s" podCreationTimestamp="2026-04-16 18:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:23:02.852002358 +0000 UTC m=+744.235665841" watchObservedRunningTime="2026-04-16 18:23:02.853828307 +0000 UTC m=+744.237491780"
Apr 16 18:23:03.839989 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:03.839945 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 16 18:23:03.840448 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:03.840425 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:23:10.238143 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:10.238092 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:5000: connect: connection refused"
Apr 16 18:23:10.238567 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:10.238219 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"
Apr 16 18:23:10.238567 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:10.238419 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:23:10.238567 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:10.238509 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"
Apr 16 18:23:13.840604 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:13.840540 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 16 18:23:13.841020 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:13.841003 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:23:17.666267 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.666241 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"
Apr 16 18:23:17.814631 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.814540 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ac956c4-200a-4c7d-8eba-cbac818ac3d3-kserve-provision-location\") pod \"7ac956c4-200a-4c7d-8eba-cbac818ac3d3\" (UID: \"7ac956c4-200a-4c7d-8eba-cbac818ac3d3\") "
Apr 16 18:23:17.814840 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.814817 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ac956c4-200a-4c7d-8eba-cbac818ac3d3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7ac956c4-200a-4c7d-8eba-cbac818ac3d3" (UID: "7ac956c4-200a-4c7d-8eba-cbac818ac3d3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:23:17.887122 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.887086 2567 generic.go:358] "Generic (PLEG): container finished" podID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerID="948997b05c6d2db436f128277d13b924de6965450910f61d19e631ff26d55412" exitCode=0
Apr 16 18:23:17.887266 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.887133 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" event={"ID":"7ac956c4-200a-4c7d-8eba-cbac818ac3d3","Type":"ContainerDied","Data":"948997b05c6d2db436f128277d13b924de6965450910f61d19e631ff26d55412"}
Apr 16 18:23:17.887266 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.887154 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q" event={"ID":"7ac956c4-200a-4c7d-8eba-cbac818ac3d3","Type":"ContainerDied","Data":"28c9d2124ff9f0d41352fe7eb2b18a0431cdd565c2fad142c59983a393dca8ec"}
Apr 16 18:23:17.887266 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.887166 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"
Apr 16 18:23:17.887539 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.887171 2567 scope.go:117] "RemoveContainer" containerID="948997b05c6d2db436f128277d13b924de6965450910f61d19e631ff26d55412"
Apr 16 18:23:17.895248 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.895127 2567 scope.go:117] "RemoveContainer" containerID="c55e9e39f387eaade71bd118b02b0de126ba069e1e87f70e63ab0ca2decd5511"
Apr 16 18:23:17.902758 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.902738 2567 scope.go:117] "RemoveContainer" containerID="28dc9e9dc817ef78abebc0af7c516e99e8c4a68c6187b4a26b4a377410cc7bed"
Apr 16 18:23:17.908097 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.908068 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"]
Apr 16 18:23:17.910761 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.910741 2567 scope.go:117] "RemoveContainer" containerID="948997b05c6d2db436f128277d13b924de6965450910f61d19e631ff26d55412"
Apr 16 18:23:17.911022 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:23:17.911003 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948997b05c6d2db436f128277d13b924de6965450910f61d19e631ff26d55412\": container with ID starting with 948997b05c6d2db436f128277d13b924de6965450910f61d19e631ff26d55412 not found: ID does not exist" containerID="948997b05c6d2db436f128277d13b924de6965450910f61d19e631ff26d55412"
Apr 16 18:23:17.911179 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.911028 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948997b05c6d2db436f128277d13b924de6965450910f61d19e631ff26d55412"} err="failed to get container status \"948997b05c6d2db436f128277d13b924de6965450910f61d19e631ff26d55412\": rpc error: code = NotFound desc = could not find container \"948997b05c6d2db436f128277d13b924de6965450910f61d19e631ff26d55412\": container with ID starting with 948997b05c6d2db436f128277d13b924de6965450910f61d19e631ff26d55412 not found: ID does not exist"
Apr 16 18:23:17.911179 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.911072 2567 scope.go:117] "RemoveContainer" containerID="c55e9e39f387eaade71bd118b02b0de126ba069e1e87f70e63ab0ca2decd5511"
Apr 16 18:23:17.911352 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:23:17.911335 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c55e9e39f387eaade71bd118b02b0de126ba069e1e87f70e63ab0ca2decd5511\": container with ID starting with c55e9e39f387eaade71bd118b02b0de126ba069e1e87f70e63ab0ca2decd5511 not found: ID does not exist" containerID="c55e9e39f387eaade71bd118b02b0de126ba069e1e87f70e63ab0ca2decd5511"
Apr 16 18:23:17.911391 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.911359 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c55e9e39f387eaade71bd118b02b0de126ba069e1e87f70e63ab0ca2decd5511"} err="failed to get container status \"c55e9e39f387eaade71bd118b02b0de126ba069e1e87f70e63ab0ca2decd5511\": rpc error: code = NotFound desc = could not find container \"c55e9e39f387eaade71bd118b02b0de126ba069e1e87f70e63ab0ca2decd5511\": container with ID starting with c55e9e39f387eaade71bd118b02b0de126ba069e1e87f70e63ab0ca2decd5511 not found: ID does not exist"
Apr 16 18:23:17.911391 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.911376 2567 scope.go:117] "RemoveContainer" containerID="28dc9e9dc817ef78abebc0af7c516e99e8c4a68c6187b4a26b4a377410cc7bed"
Apr 16 18:23:17.911509 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.911490 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-8b4c87646-mcq7q"]
Apr 16 18:23:17.911588 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:23:17.911574 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28dc9e9dc817ef78abebc0af7c516e99e8c4a68c6187b4a26b4a377410cc7bed\": container with ID starting with 28dc9e9dc817ef78abebc0af7c516e99e8c4a68c6187b4a26b4a377410cc7bed not found: ID does not exist" containerID="28dc9e9dc817ef78abebc0af7c516e99e8c4a68c6187b4a26b4a377410cc7bed"
Apr 16 18:23:17.911628 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.911592 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28dc9e9dc817ef78abebc0af7c516e99e8c4a68c6187b4a26b4a377410cc7bed"} err="failed to get container status \"28dc9e9dc817ef78abebc0af7c516e99e8c4a68c6187b4a26b4a377410cc7bed\": rpc error: code = NotFound desc = could not find container \"28dc9e9dc817ef78abebc0af7c516e99e8c4a68c6187b4a26b4a377410cc7bed\": container with ID starting with 28dc9e9dc817ef78abebc0af7c516e99e8c4a68c6187b4a26b4a377410cc7bed not found: ID does not exist"
Apr 16 18:23:17.916027 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:17.916009 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ac956c4-200a-4c7d-8eba-cbac818ac3d3-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:23:19.247467 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:19.247434 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" path="/var/lib/kubelet/pods/7ac956c4-200a-4c7d-8eba-cbac818ac3d3/volumes"
Apr 16 18:23:23.840277 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:23.840228 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 16 18:23:23.840688 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:23.840667 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:23:33.840828 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:33.840775 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 16 18:23:33.841353 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:33.841256 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:23:43.840710 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:43.840666 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 16 18:23:43.841213 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:43.841186 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:23:53.840113 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:53.840064 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 16 18:23:53.840564 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:23:53.840447 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:24:03.840846 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:03.840748 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 16 18:24:03.841304 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:03.841265 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:24:13.841228 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:13.841192 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"
Apr 16 18:24:13.841724 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:13.841605 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"
Apr 16 18:24:22.820227 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.820191 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"]
Apr 16 18:24:22.821186 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.821153 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="kserve-container" containerID="cri-o://7ca1af3b60890585dceadd87c757760129ff542aad362e04ae72cf8b4e4ed64e" gracePeriod=30
Apr 16 18:24:22.821586 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.821433 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" containerID="cri-o://14006467b27d80da1ca9ae3d8639515e7793c06ed1787389f87960947c87a9d4" gracePeriod=30
Apr 16 18:24:22.858092 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.858036 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78"]
Apr 16 18:24:22.858552 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.858534 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container"
Apr 16 18:24:22.858552 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.858554 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container"
Apr 16 18:24:22.858669 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.858563 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="storage-initializer"
Apr 16 18:24:22.858669 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.858570 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="storage-initializer"
Apr 16 18:24:22.858669 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.858579 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent"
Apr 16 18:24:22.858669 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.858585 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent"
Apr 16 18:24:22.858669 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.858648 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="kserve-container"
Apr 16 18:24:22.858669 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.858659 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ac956c4-200a-4c7d-8eba-cbac818ac3d3" containerName="agent"
Apr 16 18:24:22.861829 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.861810 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78"
Apr 16 18:24:22.868983 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.868940 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78"]
Apr 16 18:24:22.961857 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:22.961821 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1d950ae-7e57-46be-a424-4febfa2c32c9-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-hzd78\" (UID: \"d1d950ae-7e57-46be-a424-4febfa2c32c9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78"
Apr 16 18:24:23.062444 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:23.062397 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1d950ae-7e57-46be-a424-4febfa2c32c9-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-hzd78\" (UID: \"d1d950ae-7e57-46be-a424-4febfa2c32c9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78"
Apr 16 18:24:23.062844 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:23.062818 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1d950ae-7e57-46be-a424-4febfa2c32c9-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-hzd78\" (UID: \"d1d950ae-7e57-46be-a424-4febfa2c32c9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78"
Apr 16 18:24:23.176538 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:23.176423 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78"
Apr 16 18:24:23.308923 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:23.308894 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78"]
Apr 16 18:24:23.310967 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:24:23.310939 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1d950ae_7e57_46be_a424_4febfa2c32c9.slice/crio-f30f6dfc55c64b0bf4997694886b000fa6f990086c55e0416df9e92dab945cf4 WatchSource:0}: Error finding container f30f6dfc55c64b0bf4997694886b000fa6f990086c55e0416df9e92dab945cf4: Status 404 returned error can't find the container with id f30f6dfc55c64b0bf4997694886b000fa6f990086c55e0416df9e92dab945cf4
Apr 16 18:24:23.840774 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:23.840730 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 16 18:24:23.841259 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:23.841231 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:24:24.112347 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:24.112260 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" event={"ID":"d1d950ae-7e57-46be-a424-4febfa2c32c9","Type":"ContainerStarted","Data":"4f5f44af0df2e74ab2f2e96c1ba1c3d8c5d4605c7a09d901fb113c86988982d5"}
Apr 16 18:24:24.112347 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:24.112297 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" event={"ID":"d1d950ae-7e57-46be-a424-4febfa2c32c9","Type":"ContainerStarted","Data":"f30f6dfc55c64b0bf4997694886b000fa6f990086c55e0416df9e92dab945cf4"}
Apr 16 18:24:27.124958 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:27.124929 2567 generic.go:358] "Generic (PLEG): container finished" podID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerID="7ca1af3b60890585dceadd87c757760129ff542aad362e04ae72cf8b4e4ed64e" exitCode=0
Apr 16 18:24:27.125376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:27.124964 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" event={"ID":"0d7e1af1-3db3-46f2-8f27-5040ff1c185e","Type":"ContainerDied","Data":"7ca1af3b60890585dceadd87c757760129ff542aad362e04ae72cf8b4e4ed64e"}
Apr 16 18:24:28.129263 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:28.129227 2567 generic.go:358] "Generic (PLEG): container finished" podID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerID="4f5f44af0df2e74ab2f2e96c1ba1c3d8c5d4605c7a09d901fb113c86988982d5" exitCode=0
Apr 16 18:24:28.129634 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:28.129276 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" event={"ID":"d1d950ae-7e57-46be-a424-4febfa2c32c9","Type":"ContainerDied","Data":"4f5f44af0df2e74ab2f2e96c1ba1c3d8c5d4605c7a09d901fb113c86988982d5"}
Apr 16 18:24:33.840409 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:33.840361 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 16 18:24:33.840865 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:33.840741 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:24:35.159003 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:35.158967 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" event={"ID":"d1d950ae-7e57-46be-a424-4febfa2c32c9","Type":"ContainerStarted","Data":"9f2030c9e4b97c6e71140a51d56d8a97d3f07da2af215f2d0cbbd0a4d2e6b271"}
Apr 16 18:24:35.159447 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:35.159273 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78"
Apr 16 18:24:35.160488 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:35.160466 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 16 18:24:35.174482 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:35.174435 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" podStartSLOduration=6.678799859 podStartE2EDuration="13.174419848s" podCreationTimestamp="2026-04-16 18:24:22 +0000 UTC" firstStartedPulling="2026-04-16 18:24:28.130477008 +0000 UTC m=+829.514140457" lastFinishedPulling="2026-04-16 18:24:34.626096988 +0000 UTC m=+836.009760446" observedRunningTime="2026-04-16 18:24:35.173309721 +0000 UTC m=+836.556973193" watchObservedRunningTime="2026-04-16 18:24:35.174419848 +0000 UTC m=+836.558083320"
Apr 16 18:24:36.162628 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:36.162592 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 16 18:24:43.840447 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:43.840400 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 16 18:24:43.840887 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:43.840552 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"
Apr 16 18:24:43.840887 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:43.840738 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:24:43.840991 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:43.840904 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"
Apr 16 18:24:46.163337 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:46.163289 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 16 18:24:53.005263 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.005238 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"
Apr 16 18:24:53.030848 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.030824 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d7e1af1-3db3-46f2-8f27-5040ff1c185e-kserve-provision-location\") pod \"0d7e1af1-3db3-46f2-8f27-5040ff1c185e\" (UID: \"0d7e1af1-3db3-46f2-8f27-5040ff1c185e\") "
Apr 16 18:24:53.031174 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.031152 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d7e1af1-3db3-46f2-8f27-5040ff1c185e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0d7e1af1-3db3-46f2-8f27-5040ff1c185e" (UID: "0d7e1af1-3db3-46f2-8f27-5040ff1c185e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:24:53.131818 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.131733 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d7e1af1-3db3-46f2-8f27-5040ff1c185e-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:24:53.219656 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.219625 2567 generic.go:358] "Generic (PLEG): container finished" podID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerID="14006467b27d80da1ca9ae3d8639515e7793c06ed1787389f87960947c87a9d4" exitCode=137
Apr 16 18:24:53.219805 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.219714 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" event={"ID":"0d7e1af1-3db3-46f2-8f27-5040ff1c185e","Type":"ContainerDied","Data":"14006467b27d80da1ca9ae3d8639515e7793c06ed1787389f87960947c87a9d4"}
Apr 16 18:24:53.219805 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.219729 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"
Apr 16 18:24:53.219805 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.219754 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9" event={"ID":"0d7e1af1-3db3-46f2-8f27-5040ff1c185e","Type":"ContainerDied","Data":"d1f142cf98c84b5422f3d309b448ceddef4ebe5a7240a3f1d4c0ab3d2576ebfe"}
Apr 16 18:24:53.219805 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.219771 2567 scope.go:117] "RemoveContainer" containerID="14006467b27d80da1ca9ae3d8639515e7793c06ed1787389f87960947c87a9d4"
Apr 16 18:24:53.227728 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.227712 2567 scope.go:117] "RemoveContainer" containerID="7ca1af3b60890585dceadd87c757760129ff542aad362e04ae72cf8b4e4ed64e"
Apr 16 18:24:53.234559 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.234541 2567 scope.go:117] "RemoveContainer" containerID="6e25882010f29f5a465cf7303aa7c5c27af87f55dcf7eec0bc538e4c5080545e"
Apr 16 18:24:53.242345 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.242322 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"]
Apr 16 18:24:53.243487 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.243442 2567 scope.go:117] "RemoveContainer" containerID="14006467b27d80da1ca9ae3d8639515e7793c06ed1787389f87960947c87a9d4"
Apr 16 18:24:53.243780 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:24:53.243750 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14006467b27d80da1ca9ae3d8639515e7793c06ed1787389f87960947c87a9d4\": container with ID starting with 14006467b27d80da1ca9ae3d8639515e7793c06ed1787389f87960947c87a9d4 not found: ID does not exist" containerID="14006467b27d80da1ca9ae3d8639515e7793c06ed1787389f87960947c87a9d4"
Apr 16 18:24:53.243900 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.243789 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14006467b27d80da1ca9ae3d8639515e7793c06ed1787389f87960947c87a9d4"} err="failed to get container status \"14006467b27d80da1ca9ae3d8639515e7793c06ed1787389f87960947c87a9d4\": rpc error: code = NotFound desc = could not find container \"14006467b27d80da1ca9ae3d8639515e7793c06ed1787389f87960947c87a9d4\": container with ID starting with 14006467b27d80da1ca9ae3d8639515e7793c06ed1787389f87960947c87a9d4 not found: ID does not exist"
Apr 16 18:24:53.243900 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.243813 2567 scope.go:117] "RemoveContainer" containerID="7ca1af3b60890585dceadd87c757760129ff542aad362e04ae72cf8b4e4ed64e"
Apr 16 18:24:53.244124 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:24:53.244099 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca1af3b60890585dceadd87c757760129ff542aad362e04ae72cf8b4e4ed64e\": container with ID starting with 7ca1af3b60890585dceadd87c757760129ff542aad362e04ae72cf8b4e4ed64e not found: ID does not exist" containerID="7ca1af3b60890585dceadd87c757760129ff542aad362e04ae72cf8b4e4ed64e"
Apr 16 18:24:53.244231 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.244130 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca1af3b60890585dceadd87c757760129ff542aad362e04ae72cf8b4e4ed64e"} err="failed to get container status 
\"7ca1af3b60890585dceadd87c757760129ff542aad362e04ae72cf8b4e4ed64e\": rpc error: code = NotFound desc = could not find container \"7ca1af3b60890585dceadd87c757760129ff542aad362e04ae72cf8b4e4ed64e\": container with ID starting with 7ca1af3b60890585dceadd87c757760129ff542aad362e04ae72cf8b4e4ed64e not found: ID does not exist" Apr 16 18:24:53.244231 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.244155 2567 scope.go:117] "RemoveContainer" containerID="6e25882010f29f5a465cf7303aa7c5c27af87f55dcf7eec0bc538e4c5080545e" Apr 16 18:24:53.244418 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:24:53.244402 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e25882010f29f5a465cf7303aa7c5c27af87f55dcf7eec0bc538e4c5080545e\": container with ID starting with 6e25882010f29f5a465cf7303aa7c5c27af87f55dcf7eec0bc538e4c5080545e not found: ID does not exist" containerID="6e25882010f29f5a465cf7303aa7c5c27af87f55dcf7eec0bc538e4c5080545e" Apr 16 18:24:53.244478 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.244423 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e25882010f29f5a465cf7303aa7c5c27af87f55dcf7eec0bc538e4c5080545e"} err="failed to get container status \"6e25882010f29f5a465cf7303aa7c5c27af87f55dcf7eec0bc538e4c5080545e\": rpc error: code = NotFound desc = could not find container \"6e25882010f29f5a465cf7303aa7c5c27af87f55dcf7eec0bc538e4c5080545e\": container with ID starting with 6e25882010f29f5a465cf7303aa7c5c27af87f55dcf7eec0bc538e4c5080545e not found: ID does not exist" Apr 16 18:24:53.244949 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:53.244930 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-78cf4b7dc8-pwgj9"] Apr 16 18:24:55.241827 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:55.241796 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" path="/var/lib/kubelet/pods/0d7e1af1-3db3-46f2-8f27-5040ff1c185e/volumes" Apr 16 18:24:56.163607 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:24:56.163567 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 18:25:06.162728 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:25:06.162676 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 18:25:16.163599 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:25:16.163557 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 18:25:26.163490 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:25:26.163403 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: 
connection refused" Apr 16 18:25:36.163004 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:25:36.162956 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 18:25:46.163439 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:25:46.163396 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 18:25:56.164088 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:25:56.164034 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" Apr 16 18:26:03.002645 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.002607 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78"] Apr 16 18:26:03.003572 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.003535 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="kserve-container" containerID="cri-o://9f2030c9e4b97c6e71140a51d56d8a97d3f07da2af215f2d0cbbd0a4d2e6b271" gracePeriod=30 Apr 16 18:26:03.154614 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.154582 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml"] Apr 16 18:26:03.154933 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.154911 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" Apr 16 18:26:03.154933 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.154926 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" Apr 16 18:26:03.155119 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.154947 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="storage-initializer" Apr 16 18:26:03.155119 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.154958 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="storage-initializer" Apr 16 18:26:03.155119 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.154984 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="kserve-container" Apr 16 18:26:03.155119 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.154993 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="kserve-container" Apr 16 18:26:03.155119 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.155088 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" containerName="agent" Apr 16 18:26:03.155119 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.155107 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d7e1af1-3db3-46f2-8f27-5040ff1c185e" 
containerName="kserve-container" Apr 16 18:26:03.158270 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.158251 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" Apr 16 18:26:03.165787 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.165767 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml"] Apr 16 18:26:03.213636 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.213607 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-2fhml\" (UID: \"87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" Apr 16 18:26:03.314536 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.314456 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-2fhml\" (UID: \"87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" Apr 16 18:26:03.314857 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.314835 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-2fhml\" (UID: \"87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" Apr 16 18:26:03.469546 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.469495 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" Apr 16 18:26:03.800219 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:03.800194 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml"] Apr 16 18:26:03.803064 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:26:03.803016 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87c0ec98_7dd9_4a37_92c4_28bda4c3b4c8.slice/crio-4405ff2ce6c518372ef3a72238548b542e29ed82b40a7d018857d434c340b089 WatchSource:0}: Error finding container 4405ff2ce6c518372ef3a72238548b542e29ed82b40a7d018857d434c340b089: Status 404 returned error can't find the container with id 4405ff2ce6c518372ef3a72238548b542e29ed82b40a7d018857d434c340b089 Apr 16 18:26:04.460586 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:04.460550 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" event={"ID":"87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8","Type":"ContainerStarted","Data":"814dbc5b9fbe5ddf76ef2c91155964050b2a178a0ac161b9624cdd0d4eb8d191"} Apr 16 18:26:04.462938 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:04.460590 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" event={"ID":"87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8","Type":"ContainerStarted","Data":"4405ff2ce6c518372ef3a72238548b542e29ed82b40a7d018857d434c340b089"} Apr 16 18:26:06.163431 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:06.163384 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 18:26:07.472202 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:07.472165 2567 generic.go:358] "Generic (PLEG): container finished" podID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerID="9f2030c9e4b97c6e71140a51d56d8a97d3f07da2af215f2d0cbbd0a4d2e6b271" exitCode=0 Apr 16 18:26:07.472559 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:07.472222 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" event={"ID":"d1d950ae-7e57-46be-a424-4febfa2c32c9","Type":"ContainerDied","Data":"9f2030c9e4b97c6e71140a51d56d8a97d3f07da2af215f2d0cbbd0a4d2e6b271"} Apr 16 18:26:07.750102 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:07.750079 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" Apr 16 18:26:07.854237 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:07.854202 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1d950ae-7e57-46be-a424-4febfa2c32c9-kserve-provision-location\") pod \"d1d950ae-7e57-46be-a424-4febfa2c32c9\" (UID: \"d1d950ae-7e57-46be-a424-4febfa2c32c9\") " Apr 16 18:26:07.854521 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:07.854501 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1d950ae-7e57-46be-a424-4febfa2c32c9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d1d950ae-7e57-46be-a424-4febfa2c32c9" (UID: "d1d950ae-7e57-46be-a424-4febfa2c32c9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:07.955597 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:07.955567 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1d950ae-7e57-46be-a424-4febfa2c32c9-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:26:08.476394 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:08.476367 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" Apr 16 18:26:08.476394 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:08.476379 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78" event={"ID":"d1d950ae-7e57-46be-a424-4febfa2c32c9","Type":"ContainerDied","Data":"f30f6dfc55c64b0bf4997694886b000fa6f990086c55e0416df9e92dab945cf4"} Apr 16 18:26:08.476869 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:08.476423 2567 scope.go:117] "RemoveContainer" containerID="9f2030c9e4b97c6e71140a51d56d8a97d3f07da2af215f2d0cbbd0a4d2e6b271" Apr 16 18:26:08.477776 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:08.477751 2567 generic.go:358] "Generic (PLEG): container finished" podID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerID="814dbc5b9fbe5ddf76ef2c91155964050b2a178a0ac161b9624cdd0d4eb8d191" exitCode=0 Apr 16 18:26:08.477885 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:08.477784 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" event={"ID":"87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8","Type":"ContainerDied","Data":"814dbc5b9fbe5ddf76ef2c91155964050b2a178a0ac161b9624cdd0d4eb8d191"} Apr 16 18:26:08.484670 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:08.484647 2567 scope.go:117] "RemoveContainer" containerID="4f5f44af0df2e74ab2f2e96c1ba1c3d8c5d4605c7a09d901fb113c86988982d5" Apr 16 18:26:08.508949 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:08.508922 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78"] Apr 16 18:26:08.514366 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:08.514339 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-hzd78"] Apr 16 18:26:09.242064 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:09.242005 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" 
path="/var/lib/kubelet/pods/d1d950ae-7e57-46be-a424-4febfa2c32c9/volumes" Apr 16 18:26:09.482382 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:09.482345 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" event={"ID":"87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8","Type":"ContainerStarted","Data":"ee309a2cc69ef33e5963fe2f38ac8a017dff4d96e555754b37fec65a866b603b"} Apr 16 18:26:09.482827 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:09.482648 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" Apr 16 18:26:09.483952 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:09.483926 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 18:26:09.498324 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:09.498251 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" podStartSLOduration=6.498237407 podStartE2EDuration="6.498237407s" podCreationTimestamp="2026-04-16 18:26:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:26:09.496856355 +0000 UTC m=+930.880519825" watchObservedRunningTime="2026-04-16 18:26:09.498237407 +0000 UTC m=+930.881900878" Apr 16 18:26:10.492160 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:10.491999 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 18:26:20.488732 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:20.488686 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 18:26:30.488701 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:30.488657 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 18:26:40.488243 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:40.488203 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 18:26:50.488741 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:26:50.488698 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 
18:27:00.488463 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:00.488376 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 18:27:10.488757 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:10.488711 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 18:27:20.488489 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:20.488444 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 18:27:30.489247 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:30.489208 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" Apr 16 18:27:33.797717 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:33.797679 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml"] Apr 16 18:27:33.798179 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:33.798021 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="kserve-container" containerID="cri-o://ee309a2cc69ef33e5963fe2f38ac8a017dff4d96e555754b37fec65a866b603b" gracePeriod=30 Apr 16 18:27:33.883709 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:33.883668 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k"] Apr 16 18:27:33.884085 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:33.884037 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="storage-initializer" Apr 16 18:27:33.884085 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:33.884067 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="storage-initializer" Apr 16 18:27:33.884085 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:33.884080 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="kserve-container" Apr 16 18:27:33.884298 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:33.884089 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="kserve-container" Apr 16 18:27:33.884298 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:33.884168 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1d950ae-7e57-46be-a424-4febfa2c32c9" containerName="kserve-container" Apr 16 18:27:33.887852 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:33.887829 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" Apr 16 18:27:33.894631 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:33.894606 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k"] Apr 16 18:27:34.050605 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:34.050520 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f702c019-3c95-4b4f-b23a-a4d401fab906-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k\" (UID: \"f702c019-3c95-4b4f-b23a-a4d401fab906\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" Apr 16 18:27:34.151230 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:34.151196 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f702c019-3c95-4b4f-b23a-a4d401fab906-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k\" (UID: \"f702c019-3c95-4b4f-b23a-a4d401fab906\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" Apr 16 18:27:34.151554 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:34.151534 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f702c019-3c95-4b4f-b23a-a4d401fab906-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k\" (UID: \"f702c019-3c95-4b4f-b23a-a4d401fab906\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" Apr 16 18:27:34.200831 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:34.200801 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" Apr 16 18:27:34.319715 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:34.319688 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k"] Apr 16 18:27:34.322185 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:27:34.322156 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf702c019_3c95_4b4f_b23a_a4d401fab906.slice/crio-1b402d12c81aad4fbb643154f55f2315ee0a2b1c1d5e24ef98770ff76a1a43c7 WatchSource:0}: Error finding container 1b402d12c81aad4fbb643154f55f2315ee0a2b1c1d5e24ef98770ff76a1a43c7: Status 404 returned error can't find the container with id 1b402d12c81aad4fbb643154f55f2315ee0a2b1c1d5e24ef98770ff76a1a43c7 Apr 16 18:27:34.323929 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:34.323914 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:27:34.759373 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:34.759337 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" event={"ID":"f702c019-3c95-4b4f-b23a-a4d401fab906","Type":"ContainerStarted","Data":"c1e7fa1a4f8f2312a754ff4399eff8990d6f449087990f00ab476f5a856098c6"} Apr 16 18:27:34.759373 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:34.759375 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" event={"ID":"f702c019-3c95-4b4f-b23a-a4d401fab906","Type":"ContainerStarted","Data":"1b402d12c81aad4fbb643154f55f2315ee0a2b1c1d5e24ef98770ff76a1a43c7"} Apr 16 18:27:38.440862 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.440835 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" Apr 16 18:27:38.590771 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.590733 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8-kserve-provision-location\") pod \"87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8\" (UID: \"87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8\") " Apr 16 18:27:38.591092 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.591071 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" (UID: "87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:27:38.691999 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.691963 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:27:38.772942 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.772908 2567 generic.go:358] "Generic (PLEG): container finished" podID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerID="ee309a2cc69ef33e5963fe2f38ac8a017dff4d96e555754b37fec65a866b603b" exitCode=0 Apr 16 18:27:38.773123 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.772975 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" Apr 16 18:27:38.773123 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.772996 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" event={"ID":"87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8","Type":"ContainerDied","Data":"ee309a2cc69ef33e5963fe2f38ac8a017dff4d96e555754b37fec65a866b603b"} Apr 16 18:27:38.773123 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.773031 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml" event={"ID":"87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8","Type":"ContainerDied","Data":"4405ff2ce6c518372ef3a72238548b542e29ed82b40a7d018857d434c340b089"} Apr 16 18:27:38.773123 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.773066 2567 scope.go:117] "RemoveContainer" containerID="ee309a2cc69ef33e5963fe2f38ac8a017dff4d96e555754b37fec65a866b603b" Apr 16 18:27:38.774508 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.774486 2567 generic.go:358] "Generic (PLEG): container finished" podID="f702c019-3c95-4b4f-b23a-a4d401fab906" containerID="c1e7fa1a4f8f2312a754ff4399eff8990d6f449087990f00ab476f5a856098c6" exitCode=0 Apr 16 18:27:38.774645 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.774568 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" event={"ID":"f702c019-3c95-4b4f-b23a-a4d401fab906","Type":"ContainerDied","Data":"c1e7fa1a4f8f2312a754ff4399eff8990d6f449087990f00ab476f5a856098c6"} Apr 16 18:27:38.781587 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.781570 2567 scope.go:117] "RemoveContainer" containerID="814dbc5b9fbe5ddf76ef2c91155964050b2a178a0ac161b9624cdd0d4eb8d191" Apr 16 18:27:38.788542 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.788526 2567 scope.go:117] "RemoveContainer" containerID="ee309a2cc69ef33e5963fe2f38ac8a017dff4d96e555754b37fec65a866b603b" Apr 16 18:27:38.788778 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:27:38.788760 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee309a2cc69ef33e5963fe2f38ac8a017dff4d96e555754b37fec65a866b603b\": container with ID starting with ee309a2cc69ef33e5963fe2f38ac8a017dff4d96e555754b37fec65a866b603b not found: ID does not exist" containerID="ee309a2cc69ef33e5963fe2f38ac8a017dff4d96e555754b37fec65a866b603b" Apr 16 18:27:38.788845 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.788789 2567 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ee309a2cc69ef33e5963fe2f38ac8a017dff4d96e555754b37fec65a866b603b"} err="failed to get container status \"ee309a2cc69ef33e5963fe2f38ac8a017dff4d96e555754b37fec65a866b603b\": rpc error: code = NotFound desc = could not find container \"ee309a2cc69ef33e5963fe2f38ac8a017dff4d96e555754b37fec65a866b603b\": container with ID starting with ee309a2cc69ef33e5963fe2f38ac8a017dff4d96e555754b37fec65a866b603b not found: ID does not exist" Apr 16 18:27:38.788845 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.788813 2567 scope.go:117] "RemoveContainer" containerID="814dbc5b9fbe5ddf76ef2c91155964050b2a178a0ac161b9624cdd0d4eb8d191" Apr 16 18:27:38.789092 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:27:38.789030 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"814dbc5b9fbe5ddf76ef2c91155964050b2a178a0ac161b9624cdd0d4eb8d191\": container with ID starting with 814dbc5b9fbe5ddf76ef2c91155964050b2a178a0ac161b9624cdd0d4eb8d191 not found: ID does not exist" containerID="814dbc5b9fbe5ddf76ef2c91155964050b2a178a0ac161b9624cdd0d4eb8d191" Apr 16 18:27:38.789092 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.789070 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"814dbc5b9fbe5ddf76ef2c91155964050b2a178a0ac161b9624cdd0d4eb8d191"} err="failed to get container status \"814dbc5b9fbe5ddf76ef2c91155964050b2a178a0ac161b9624cdd0d4eb8d191\": rpc error: code = NotFound desc = could not find container \"814dbc5b9fbe5ddf76ef2c91155964050b2a178a0ac161b9624cdd0d4eb8d191\": container with ID starting with 814dbc5b9fbe5ddf76ef2c91155964050b2a178a0ac161b9624cdd0d4eb8d191 not found: ID does not exist" Apr 16 18:27:38.803828 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.803806 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml"] Apr 16 18:27:38.807841 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:38.807818 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-2fhml"] Apr 16 18:27:39.243602 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:27:39.243569 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" path="/var/lib/kubelet/pods/87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8/volumes" Apr 16 18:29:41.230568 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:29:41.230532 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" event={"ID":"f702c019-3c95-4b4f-b23a-a4d401fab906","Type":"ContainerStarted","Data":"71496619de260f324e5f7c8679eeacbf27746f12779e612d96485637d160407c"} Apr 16 18:29:41.230962 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:29:41.230635 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" Apr 16 18:29:41.254955 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:29:41.254904 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" podStartSLOduration=6.656480973 podStartE2EDuration="2m8.254890044s" podCreationTimestamp="2026-04-16 18:27:33 +0000 UTC" firstStartedPulling="2026-04-16 18:27:38.775932132 +0000 UTC m=+1020.159595581" lastFinishedPulling="2026-04-16 18:29:40.374341189 +0000 UTC 
m=+1141.758004652" observedRunningTime="2026-04-16 18:29:41.252636956 +0000 UTC m=+1142.636300427" watchObservedRunningTime="2026-04-16 18:29:41.254890044 +0000 UTC m=+1142.638553514" Apr 16 18:30:12.239157 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:12.239127 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" Apr 16 18:30:14.419224 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:14.419187 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k"] Apr 16 18:30:14.419605 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:14.419458 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" podUID="f702c019-3c95-4b4f-b23a-a4d401fab906" containerName="kserve-container" containerID="cri-o://71496619de260f324e5f7c8679eeacbf27746f12779e612d96485637d160407c" gracePeriod=30 Apr 16 18:30:14.643115 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:14.643078 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2"] Apr 16 18:30:14.643490 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:14.643474 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="kserve-container" Apr 16 18:30:14.643575 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:14.643494 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="kserve-container" Apr 16 18:30:14.643575 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:14.643520 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="storage-initializer" Apr 16 18:30:14.643575 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:14.643529 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="storage-initializer" Apr 16 18:30:14.643724 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:14.643634 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="87c0ec98-7dd9-4a37-92c4-28bda4c3b4c8" containerName="kserve-container" Apr 16 18:30:14.670180 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:14.670093 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" Apr 16 18:30:14.672983 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:14.672956 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2"] Apr 16 18:30:14.792944 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:14.792907 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59586293-8f8d-44d3-bd1f-5905f1885214-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2\" (UID: \"59586293-8f8d-44d3-bd1f-5905f1885214\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" Apr 16 18:30:14.893789 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:14.893750 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59586293-8f8d-44d3-bd1f-5905f1885214-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2\" (UID: \"59586293-8f8d-44d3-bd1f-5905f1885214\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" Apr 16 18:30:14.894154 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:14.894134 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59586293-8f8d-44d3-bd1f-5905f1885214-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2\" (UID: \"59586293-8f8d-44d3-bd1f-5905f1885214\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" Apr 16 18:30:14.981387 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:14.981355 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" Apr 16 18:30:15.165104 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:15.165072 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2"] Apr 16 18:30:15.168116 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:30:15.168090 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59586293_8f8d_44d3_bd1f_5905f1885214.slice/crio-ae1d769c17a34fb026e819787da23081b8e1f070b01e079d6077c893065f7340 WatchSource:0}: Error finding container ae1d769c17a34fb026e819787da23081b8e1f070b01e079d6077c893065f7340: Status 404 returned error can't find the container with id ae1d769c17a34fb026e819787da23081b8e1f070b01e079d6077c893065f7340 Apr 16 18:30:15.349337 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:15.349302 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" event={"ID":"59586293-8f8d-44d3-bd1f-5905f1885214","Type":"ContainerStarted","Data":"efc7bdc89f5cc416125b34312b046cc33beb3b001812f35fc397bf9a7112af16"} Apr 16 18:30:15.349337 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:15.349340 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" event={"ID":"59586293-8f8d-44d3-bd1f-5905f1885214","Type":"ContainerStarted","Data":"ae1d769c17a34fb026e819787da23081b8e1f070b01e079d6077c893065f7340"} Apr 16 18:30:18.826350 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:18.826323 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" Apr 16 18:30:18.926399 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:18.926365 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f702c019-3c95-4b4f-b23a-a4d401fab906-kserve-provision-location\") pod \"f702c019-3c95-4b4f-b23a-a4d401fab906\" (UID: \"f702c019-3c95-4b4f-b23a-a4d401fab906\") " Apr 16 18:30:18.926742 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:18.926719 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f702c019-3c95-4b4f-b23a-a4d401fab906-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f702c019-3c95-4b4f-b23a-a4d401fab906" (UID: "f702c019-3c95-4b4f-b23a-a4d401fab906"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:30:19.026955 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.026926 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f702c019-3c95-4b4f-b23a-a4d401fab906-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:30:19.363910 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.363877 2567 generic.go:358] "Generic (PLEG): container finished" podID="f702c019-3c95-4b4f-b23a-a4d401fab906" containerID="71496619de260f324e5f7c8679eeacbf27746f12779e612d96485637d160407c" exitCode=0 Apr 16 18:30:19.364107 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.363955 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" Apr 16 18:30:19.364107 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.363961 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" event={"ID":"f702c019-3c95-4b4f-b23a-a4d401fab906","Type":"ContainerDied","Data":"71496619de260f324e5f7c8679eeacbf27746f12779e612d96485637d160407c"} Apr 16 18:30:19.364107 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.364002 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k" event={"ID":"f702c019-3c95-4b4f-b23a-a4d401fab906","Type":"ContainerDied","Data":"1b402d12c81aad4fbb643154f55f2315ee0a2b1c1d5e24ef98770ff76a1a43c7"} Apr 16 18:30:19.364107 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.364019 2567 scope.go:117] "RemoveContainer" containerID="71496619de260f324e5f7c8679eeacbf27746f12779e612d96485637d160407c" Apr 16 18:30:19.365449 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.365426 2567 generic.go:358] "Generic (PLEG): container finished" podID="59586293-8f8d-44d3-bd1f-5905f1885214" containerID="efc7bdc89f5cc416125b34312b046cc33beb3b001812f35fc397bf9a7112af16" exitCode=0 Apr 16 18:30:19.365554 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.365474 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" event={"ID":"59586293-8f8d-44d3-bd1f-5905f1885214","Type":"ContainerDied","Data":"efc7bdc89f5cc416125b34312b046cc33beb3b001812f35fc397bf9a7112af16"} Apr 16 18:30:19.373008 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.372988 2567 scope.go:117] "RemoveContainer" containerID="c1e7fa1a4f8f2312a754ff4399eff8990d6f449087990f00ab476f5a856098c6" Apr 16 18:30:19.380167 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.380151 2567 scope.go:117] "RemoveContainer" containerID="71496619de260f324e5f7c8679eeacbf27746f12779e612d96485637d160407c" Apr 16 18:30:19.380422 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:30:19.380404 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71496619de260f324e5f7c8679eeacbf27746f12779e612d96485637d160407c\": container with ID starting with 71496619de260f324e5f7c8679eeacbf27746f12779e612d96485637d160407c not found: ID does not exist" containerID="71496619de260f324e5f7c8679eeacbf27746f12779e612d96485637d160407c" Apr 16 18:30:19.380471 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.380433 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71496619de260f324e5f7c8679eeacbf27746f12779e612d96485637d160407c"} err="failed to get container status \"71496619de260f324e5f7c8679eeacbf27746f12779e612d96485637d160407c\": rpc error: code = NotFound desc = could not find container \"71496619de260f324e5f7c8679eeacbf27746f12779e612d96485637d160407c\": container with ID starting with 71496619de260f324e5f7c8679eeacbf27746f12779e612d96485637d160407c not found: ID does not exist" Apr 16 18:30:19.380471 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.380450 2567 scope.go:117] "RemoveContainer" containerID="c1e7fa1a4f8f2312a754ff4399eff8990d6f449087990f00ab476f5a856098c6" Apr 16 18:30:19.380683 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:30:19.380663 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"c1e7fa1a4f8f2312a754ff4399eff8990d6f449087990f00ab476f5a856098c6\": container with ID starting with c1e7fa1a4f8f2312a754ff4399eff8990d6f449087990f00ab476f5a856098c6 not found: ID does not exist" containerID="c1e7fa1a4f8f2312a754ff4399eff8990d6f449087990f00ab476f5a856098c6" Apr 16 18:30:19.380741 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.380686 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1e7fa1a4f8f2312a754ff4399eff8990d6f449087990f00ab476f5a856098c6"} err="failed to get container status \"c1e7fa1a4f8f2312a754ff4399eff8990d6f449087990f00ab476f5a856098c6\": rpc error: code = NotFound desc = could not find container \"c1e7fa1a4f8f2312a754ff4399eff8990d6f449087990f00ab476f5a856098c6\": container with ID starting with c1e7fa1a4f8f2312a754ff4399eff8990d6f449087990f00ab476f5a856098c6 not found: ID does not exist" Apr 16 18:30:19.387724 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.387702 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k"] Apr 16 18:30:19.391846 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:19.391825 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-q6t5k"] Apr 16 18:30:20.378174 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:20.378132 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" event={"ID":"59586293-8f8d-44d3-bd1f-5905f1885214","Type":"ContainerStarted","Data":"1c1ddcb90fff035c2ee223f3937ad84e06dda9b5d4b55bb7b6b952f22078d062"} Apr 16 18:30:20.378644 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:20.378466 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" Apr 16 18:30:20.380026 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:20.379979 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" podUID="59586293-8f8d-44d3-bd1f-5905f1885214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 18:30:20.424379 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:20.424325 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" podStartSLOduration=6.424308662 podStartE2EDuration="6.424308662s" podCreationTimestamp="2026-04-16 18:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:20.423145387 +0000 UTC m=+1181.806808860" watchObservedRunningTime="2026-04-16 18:30:20.424308662 +0000 UTC m=+1181.807972133" Apr 16 18:30:21.243460 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:21.243426 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f702c019-3c95-4b4f-b23a-a4d401fab906" path="/var/lib/kubelet/pods/f702c019-3c95-4b4f-b23a-a4d401fab906/volumes" Apr 16 18:30:21.383502 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:21.383466 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" podUID="59586293-8f8d-44d3-bd1f-5905f1885214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: 
connect: connection refused" Apr 16 18:30:31.384262 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:31.384223 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" Apr 16 18:30:34.560093 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:34.560053 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2"] Apr 16 18:30:34.560559 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:34.560409 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" podUID="59586293-8f8d-44d3-bd1f-5905f1885214" containerName="kserve-container" containerID="cri-o://1c1ddcb90fff035c2ee223f3937ad84e06dda9b5d4b55bb7b6b952f22078d062" gracePeriod=30 Apr 16 18:30:34.660183 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:34.660146 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn"] Apr 16 18:30:34.660504 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:34.660491 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f702c019-3c95-4b4f-b23a-a4d401fab906" containerName="storage-initializer" Apr 16 18:30:34.660556 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:34.660505 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f702c019-3c95-4b4f-b23a-a4d401fab906" containerName="storage-initializer" Apr 16 18:30:34.660556 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:34.660523 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f702c019-3c95-4b4f-b23a-a4d401fab906" containerName="kserve-container" Apr 16 18:30:34.660556 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:34.660530 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f702c019-3c95-4b4f-b23a-a4d401fab906" containerName="kserve-container" Apr 16 18:30:34.660663 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:34.660586 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f702c019-3c95-4b4f-b23a-a4d401fab906" containerName="kserve-container" Apr 16 18:30:34.662808 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:34.662787 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" Apr 16 18:30:34.684828 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:34.684804 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn"] Apr 16 18:30:34.762239 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:34.762205 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/712124bd-b639-4596-8f32-8fd7a947a16e-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn\" (UID: \"712124bd-b639-4596-8f32-8fd7a947a16e\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" Apr 16 18:30:34.863539 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:34.863444 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/712124bd-b639-4596-8f32-8fd7a947a16e-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn\" (UID: \"712124bd-b639-4596-8f32-8fd7a947a16e\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" Apr 16 18:30:34.863844 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:34.863821 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/712124bd-b639-4596-8f32-8fd7a947a16e-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn\" (UID: \"712124bd-b639-4596-8f32-8fd7a947a16e\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" Apr 16 18:30:34.972970 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:34.972924 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" Apr 16 18:30:35.108688 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.108656 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn"] Apr 16 18:30:35.111573 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:30:35.111544 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod712124bd_b639_4596_8f32_8fd7a947a16e.slice/crio-5bf66eae16ae3f540fc2a8346f792c4b298b47173d9826b47357b07ae39dd76e WatchSource:0}: Error finding container 5bf66eae16ae3f540fc2a8346f792c4b298b47173d9826b47357b07ae39dd76e: Status 404 returned error can't find the container with id 5bf66eae16ae3f540fc2a8346f792c4b298b47173d9826b47357b07ae39dd76e Apr 16 18:30:35.229189 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.229165 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" Apr 16 18:30:35.368276 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.368192 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59586293-8f8d-44d3-bd1f-5905f1885214-kserve-provision-location\") pod \"59586293-8f8d-44d3-bd1f-5905f1885214\" (UID: \"59586293-8f8d-44d3-bd1f-5905f1885214\") " Apr 16 18:30:35.368649 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.368621 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59586293-8f8d-44d3-bd1f-5905f1885214-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "59586293-8f8d-44d3-bd1f-5905f1885214" (UID: "59586293-8f8d-44d3-bd1f-5905f1885214"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:30:35.432587 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.432548 2567 generic.go:358] "Generic (PLEG): container finished" podID="59586293-8f8d-44d3-bd1f-5905f1885214" containerID="1c1ddcb90fff035c2ee223f3937ad84e06dda9b5d4b55bb7b6b952f22078d062" exitCode=0 Apr 16 18:30:35.432778 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.432629 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" Apr 16 18:30:35.432778 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.432629 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" event={"ID":"59586293-8f8d-44d3-bd1f-5905f1885214","Type":"ContainerDied","Data":"1c1ddcb90fff035c2ee223f3937ad84e06dda9b5d4b55bb7b6b952f22078d062"} Apr 16 18:30:35.432778 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.432671 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2" event={"ID":"59586293-8f8d-44d3-bd1f-5905f1885214","Type":"ContainerDied","Data":"ae1d769c17a34fb026e819787da23081b8e1f070b01e079d6077c893065f7340"} Apr 16 18:30:35.432778 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.432691 2567 scope.go:117] "RemoveContainer" containerID="1c1ddcb90fff035c2ee223f3937ad84e06dda9b5d4b55bb7b6b952f22078d062" Apr 16 18:30:35.434248 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.434219 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" event={"ID":"712124bd-b639-4596-8f32-8fd7a947a16e","Type":"ContainerStarted","Data":"261f16c36163c0f6b1f1cac48420a580ed8901887819b4013b9ba773924b5369"} Apr 16 18:30:35.434248 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.434252 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" event={"ID":"712124bd-b639-4596-8f32-8fd7a947a16e","Type":"ContainerStarted","Data":"5bf66eae16ae3f540fc2a8346f792c4b298b47173d9826b47357b07ae39dd76e"} Apr 16 18:30:35.441536 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.441207 2567 scope.go:117] "RemoveContainer" containerID="efc7bdc89f5cc416125b34312b046cc33beb3b001812f35fc397bf9a7112af16" Apr 16 18:30:35.449672 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.449649 2567 scope.go:117] "RemoveContainer" containerID="1c1ddcb90fff035c2ee223f3937ad84e06dda9b5d4b55bb7b6b952f22078d062" Apr 
16 18:30:35.449960 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:30:35.449934 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c1ddcb90fff035c2ee223f3937ad84e06dda9b5d4b55bb7b6b952f22078d062\": container with ID starting with 1c1ddcb90fff035c2ee223f3937ad84e06dda9b5d4b55bb7b6b952f22078d062 not found: ID does not exist" containerID="1c1ddcb90fff035c2ee223f3937ad84e06dda9b5d4b55bb7b6b952f22078d062" Apr 16 18:30:35.450069 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.449968 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c1ddcb90fff035c2ee223f3937ad84e06dda9b5d4b55bb7b6b952f22078d062"} err="failed to get container status \"1c1ddcb90fff035c2ee223f3937ad84e06dda9b5d4b55bb7b6b952f22078d062\": rpc error: code = NotFound desc = could not find container \"1c1ddcb90fff035c2ee223f3937ad84e06dda9b5d4b55bb7b6b952f22078d062\": container with ID starting with 1c1ddcb90fff035c2ee223f3937ad84e06dda9b5d4b55bb7b6b952f22078d062 not found: ID does not exist" Apr 16 18:30:35.450069 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.449986 2567 scope.go:117] "RemoveContainer" containerID="efc7bdc89f5cc416125b34312b046cc33beb3b001812f35fc397bf9a7112af16" Apr 16 18:30:35.450283 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:30:35.450263 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efc7bdc89f5cc416125b34312b046cc33beb3b001812f35fc397bf9a7112af16\": container with ID starting with efc7bdc89f5cc416125b34312b046cc33beb3b001812f35fc397bf9a7112af16 not found: ID does not exist" containerID="efc7bdc89f5cc416125b34312b046cc33beb3b001812f35fc397bf9a7112af16" Apr 16 18:30:35.450349 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.450291 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc7bdc89f5cc416125b34312b046cc33beb3b001812f35fc397bf9a7112af16"} err="failed to get container status \"efc7bdc89f5cc416125b34312b046cc33beb3b001812f35fc397bf9a7112af16\": rpc error: code = NotFound desc = could not find container \"efc7bdc89f5cc416125b34312b046cc33beb3b001812f35fc397bf9a7112af16\": container with ID starting with efc7bdc89f5cc416125b34312b046cc33beb3b001812f35fc397bf9a7112af16 not found: ID does not exist" Apr 16 18:30:35.468994 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.468959 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59586293-8f8d-44d3-bd1f-5905f1885214-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:30:35.504270 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.504233 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2"] Apr 16 18:30:35.524721 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:35.524688 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7qj2"] Apr 16 18:30:37.241523 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:37.241489 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59586293-8f8d-44d3-bd1f-5905f1885214" path="/var/lib/kubelet/pods/59586293-8f8d-44d3-bd1f-5905f1885214/volumes" Apr 16 18:30:39.450567 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:39.450538 2567 generic.go:358] "Generic (PLEG): container 
finished" podID="712124bd-b639-4596-8f32-8fd7a947a16e" containerID="261f16c36163c0f6b1f1cac48420a580ed8901887819b4013b9ba773924b5369" exitCode=0 Apr 16 18:30:39.450947 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:39.450616 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" event={"ID":"712124bd-b639-4596-8f32-8fd7a947a16e","Type":"ContainerDied","Data":"261f16c36163c0f6b1f1cac48420a580ed8901887819b4013b9ba773924b5369"} Apr 16 18:30:40.455273 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:40.455231 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" event={"ID":"712124bd-b639-4596-8f32-8fd7a947a16e","Type":"ContainerStarted","Data":"1ee467ac3a76eac74bbda3bfdeae16f03431fc096d6a10f0cc8d8ffe6be828f4"} Apr 16 18:30:40.455631 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:40.455483 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" Apr 16 18:30:40.491831 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:30:40.491781 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" podStartSLOduration=6.491764156 podStartE2EDuration="6.491764156s" podCreationTimestamp="2026-04-16 18:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:40.491228167 +0000 UTC m=+1201.874891638" watchObservedRunningTime="2026-04-16 18:30:40.491764156 +0000 UTC m=+1201.875427628" Apr 16 18:31:11.464147 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:11.464116 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" Apr 16 18:31:14.873809 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:14.873775 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn"] Apr 16 18:31:14.874217 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:14.874018 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" podUID="712124bd-b639-4596-8f32-8fd7a947a16e" containerName="kserve-container" containerID="cri-o://1ee467ac3a76eac74bbda3bfdeae16f03431fc096d6a10f0cc8d8ffe6be828f4" gracePeriod=30 Apr 16 18:31:15.012532 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:15.012491 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp"] Apr 16 18:31:15.013030 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:15.013008 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59586293-8f8d-44d3-bd1f-5905f1885214" containerName="kserve-container" Apr 16 18:31:15.013144 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:15.013037 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="59586293-8f8d-44d3-bd1f-5905f1885214" containerName="kserve-container" Apr 16 18:31:15.013144 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:15.013072 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59586293-8f8d-44d3-bd1f-5905f1885214" containerName="storage-initializer" Apr 16 18:31:15.013144 ip-10-0-141-192 kubenswrapper[2567]: I0416 
18:31:15.013081 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="59586293-8f8d-44d3-bd1f-5905f1885214" containerName="storage-initializer" Apr 16 18:31:15.013243 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:15.013170 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="59586293-8f8d-44d3-bd1f-5905f1885214" containerName="kserve-container" Apr 16 18:31:15.016271 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:15.016250 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" Apr 16 18:31:15.036633 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:15.036595 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp"] Apr 16 18:31:15.077916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:15.077879 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/368c76b7-70da-474b-abb4-b3d09dc918f6-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-65779c8984-b6qxp\" (UID: \"368c76b7-70da-474b-abb4-b3d09dc918f6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" Apr 16 18:31:15.179029 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:15.178931 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/368c76b7-70da-474b-abb4-b3d09dc918f6-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-65779c8984-b6qxp\" (UID: \"368c76b7-70da-474b-abb4-b3d09dc918f6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" Apr 16 18:31:15.179378 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:15.179355 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/368c76b7-70da-474b-abb4-b3d09dc918f6-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-65779c8984-b6qxp\" (UID: \"368c76b7-70da-474b-abb4-b3d09dc918f6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" Apr 16 18:31:15.326555 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:15.326515 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" Apr 16 18:31:15.472036 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:15.472005 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp"] Apr 16 18:31:15.473251 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:31:15.473219 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod368c76b7_70da_474b_abb4_b3d09dc918f6.slice/crio-cb1f6bd4e065fce3908f627fbca30bbf70a399f13b11094c021df0b4b814fde6 WatchSource:0}: Error finding container cb1f6bd4e065fce3908f627fbca30bbf70a399f13b11094c021df0b4b814fde6: Status 404 returned error can't find the container with id cb1f6bd4e065fce3908f627fbca30bbf70a399f13b11094c021df0b4b814fde6 Apr 16 18:31:15.575484 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:15.575447 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" event={"ID":"368c76b7-70da-474b-abb4-b3d09dc918f6","Type":"ContainerStarted","Data":"0525bedeef3f403ed3e34d57b7955c5b458e1b4e35c2090f4f086563f52c15c1"} Apr 16 18:31:15.575484 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:15.575492 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" event={"ID":"368c76b7-70da-474b-abb4-b3d09dc918f6","Type":"ContainerStarted","Data":"cb1f6bd4e065fce3908f627fbca30bbf70a399f13b11094c021df0b4b814fde6"} Apr 16 18:31:19.590577 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:19.590545 2567 generic.go:358] "Generic (PLEG): container finished" podID="712124bd-b639-4596-8f32-8fd7a947a16e" containerID="1ee467ac3a76eac74bbda3bfdeae16f03431fc096d6a10f0cc8d8ffe6be828f4" exitCode=0 Apr 16 18:31:19.590866 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:19.590619 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" event={"ID":"712124bd-b639-4596-8f32-8fd7a947a16e","Type":"ContainerDied","Data":"1ee467ac3a76eac74bbda3bfdeae16f03431fc096d6a10f0cc8d8ffe6be828f4"} Apr 16 18:31:19.591974 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:19.591948 2567 generic.go:358] "Generic (PLEG): container finished" podID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerID="0525bedeef3f403ed3e34d57b7955c5b458e1b4e35c2090f4f086563f52c15c1" exitCode=0 Apr 16 18:31:19.592101 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:19.592001 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" event={"ID":"368c76b7-70da-474b-abb4-b3d09dc918f6","Type":"ContainerDied","Data":"0525bedeef3f403ed3e34d57b7955c5b458e1b4e35c2090f4f086563f52c15c1"} Apr 16 18:31:19.730592 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:19.730570 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" Apr 16 18:31:19.818991 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:19.818956 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/712124bd-b639-4596-8f32-8fd7a947a16e-kserve-provision-location\") pod \"712124bd-b639-4596-8f32-8fd7a947a16e\" (UID: \"712124bd-b639-4596-8f32-8fd7a947a16e\") " Apr 16 18:31:19.819360 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:19.819335 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/712124bd-b639-4596-8f32-8fd7a947a16e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "712124bd-b639-4596-8f32-8fd7a947a16e" (UID: "712124bd-b639-4596-8f32-8fd7a947a16e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:31:19.919834 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:19.919750 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/712124bd-b639-4596-8f32-8fd7a947a16e-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:31:20.597255 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:20.597223 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" Apr 16 18:31:20.597255 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:20.597231 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn" event={"ID":"712124bd-b639-4596-8f32-8fd7a947a16e","Type":"ContainerDied","Data":"5bf66eae16ae3f540fc2a8346f792c4b298b47173d9826b47357b07ae39dd76e"} Apr 16 18:31:20.597770 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:20.597285 2567 scope.go:117] "RemoveContainer" containerID="1ee467ac3a76eac74bbda3bfdeae16f03431fc096d6a10f0cc8d8ffe6be828f4" Apr 16 18:31:20.599013 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:20.598987 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" event={"ID":"368c76b7-70da-474b-abb4-b3d09dc918f6","Type":"ContainerStarted","Data":"7b35de25485432fb5e1af1b8aa0e616314238214a6604e9a1444543f096a67f7"} Apr 16 18:31:20.609464 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:20.608860 2567 scope.go:117] "RemoveContainer" containerID="261f16c36163c0f6b1f1cac48420a580ed8901887819b4013b9ba773924b5369" Apr 16 18:31:20.628757 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:20.628726 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn"] Apr 16 18:31:20.637584 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:20.637559 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-8dxwn"] Apr 16 18:31:21.242592 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:21.242561 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="712124bd-b639-4596-8f32-8fd7a947a16e" path="/var/lib/kubelet/pods/712124bd-b639-4596-8f32-8fd7a947a16e/volumes" Apr 16 18:31:22.608219 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:22.608176 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" event={"ID":"368c76b7-70da-474b-abb4-b3d09dc918f6","Type":"ContainerStarted","Data":"498a05eaa3b8ee7a73db214ba0ded841d76c2a23c7727adc9a30cbcf3998fab8"} Apr 16 18:31:22.608580 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:22.608402 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" Apr 16 18:31:22.657183 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:22.657130 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" podStartSLOduration=6.183170353 podStartE2EDuration="8.657115741s" podCreationTimestamp="2026-04-16 18:31:14 +0000 UTC" firstStartedPulling="2026-04-16 18:31:19.667188809 +0000 UTC m=+1241.050852272" lastFinishedPulling="2026-04-16 18:31:22.141134212 +0000 UTC m=+1243.524797660" observedRunningTime="2026-04-16 18:31:22.655155881 +0000 UTC m=+1244.038819351" watchObservedRunningTime="2026-04-16 18:31:22.657115741 +0000 UTC m=+1244.040779209" Apr 16 18:31:23.611422 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:23.611391 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" Apr 16 18:31:54.618136 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:31:54.618101 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" Apr 16 18:32:24.619472 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:24.619443 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" Apr 16 18:32:34.889421 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:34.889388 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp"] Apr 16 18:32:34.890356 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:34.890299 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" podUID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerName="kserve-container" containerID="cri-o://7b35de25485432fb5e1af1b8aa0e616314238214a6604e9a1444543f096a67f7" gracePeriod=30 Apr 16 18:32:34.890543 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:34.890353 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" podUID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerName="kserve-agent" containerID="cri-o://498a05eaa3b8ee7a73db214ba0ded841d76c2a23c7727adc9a30cbcf3998fab8" gracePeriod=30 Apr 16 18:32:34.954559 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:34.954524 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x"] Apr 16 18:32:34.954900 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:34.954888 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="712124bd-b639-4596-8f32-8fd7a947a16e" containerName="kserve-container" Apr 16 18:32:34.954943 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:34.954901 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="712124bd-b639-4596-8f32-8fd7a947a16e" containerName="kserve-container" Apr 16 18:32:34.954943 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:34.954921 2567 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="712124bd-b639-4596-8f32-8fd7a947a16e" containerName="storage-initializer" Apr 16 18:32:34.954943 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:34.954927 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="712124bd-b639-4596-8f32-8fd7a947a16e" containerName="storage-initializer" Apr 16 18:32:34.955037 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:34.954974 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="712124bd-b639-4596-8f32-8fd7a947a16e" containerName="kserve-container" Apr 16 18:32:34.957305 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:34.957286 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" Apr 16 18:32:34.966962 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:34.966933 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x"] Apr 16 18:32:35.053245 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:35.053208 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08d643f5-b594-497e-b002-af127afbee90-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-zmq4x\" (UID: \"08d643f5-b594-497e-b002-af127afbee90\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" Apr 16 18:32:35.154294 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:35.154196 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08d643f5-b594-497e-b002-af127afbee90-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-zmq4x\" (UID: \"08d643f5-b594-497e-b002-af127afbee90\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" Apr 16 18:32:35.154605 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:35.154582 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08d643f5-b594-497e-b002-af127afbee90-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-zmq4x\" (UID: \"08d643f5-b594-497e-b002-af127afbee90\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" Apr 16 18:32:35.269464 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:35.269438 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" Apr 16 18:32:35.395764 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:35.395738 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x"] Apr 16 18:32:35.398716 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:32:35.398681 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08d643f5_b594_497e_b002_af127afbee90.slice/crio-43fb8d11932c43fcfc0d6f0a74c634fd94c7da722fd0f7061a3f137ec493ea9b WatchSource:0}: Error finding container 43fb8d11932c43fcfc0d6f0a74c634fd94c7da722fd0f7061a3f137ec493ea9b: Status 404 returned error can't find the container with id 43fb8d11932c43fcfc0d6f0a74c634fd94c7da722fd0f7061a3f137ec493ea9b Apr 16 18:32:35.400994 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:35.400976 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:32:35.859806 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:35.859770 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" event={"ID":"08d643f5-b594-497e-b002-af127afbee90","Type":"ContainerStarted","Data":"2e390fa37683de9259f1bef7c2495fb9d2818bfe78cc6985bd7d9df8b9628ee2"} Apr 16 18:32:35.859806 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:35.859810 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" event={"ID":"08d643f5-b594-497e-b002-af127afbee90","Type":"ContainerStarted","Data":"43fb8d11932c43fcfc0d6f0a74c634fd94c7da722fd0f7061a3f137ec493ea9b"} Apr 16 18:32:37.868349 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:37.868317 2567 generic.go:358] "Generic (PLEG): container finished" podID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerID="7b35de25485432fb5e1af1b8aa0e616314238214a6604e9a1444543f096a67f7" exitCode=0 Apr 16 18:32:37.868706 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:37.868384 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" event={"ID":"368c76b7-70da-474b-abb4-b3d09dc918f6","Type":"ContainerDied","Data":"7b35de25485432fb5e1af1b8aa0e616314238214a6604e9a1444543f096a67f7"} Apr 16 18:32:39.875973 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:39.875936 2567 generic.go:358] "Generic (PLEG): container finished" podID="08d643f5-b594-497e-b002-af127afbee90" containerID="2e390fa37683de9259f1bef7c2495fb9d2818bfe78cc6985bd7d9df8b9628ee2" exitCode=0 Apr 16 18:32:39.876351 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:39.875994 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" event={"ID":"08d643f5-b594-497e-b002-af127afbee90","Type":"ContainerDied","Data":"2e390fa37683de9259f1bef7c2495fb9d2818bfe78cc6985bd7d9df8b9628ee2"} Apr 16 18:32:44.615761 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:44.615719 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" podUID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.42:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 18:32:51.926837 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:51.926796 2567 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" event={"ID":"08d643f5-b594-497e-b002-af127afbee90","Type":"ContainerStarted","Data":"0696d216e0c3e9026a8e66b77629563d08dc75bdd3828f7fec5358e1febac357"} Apr 16 18:32:51.927270 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:51.927073 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" Apr 16 18:32:51.928093 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:51.928065 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" podUID="08d643f5-b594-497e-b002-af127afbee90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 18:32:51.944828 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:51.944778 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" podStartSLOduration=6.975708925 podStartE2EDuration="17.94476169s" podCreationTimestamp="2026-04-16 18:32:34 +0000 UTC" firstStartedPulling="2026-04-16 18:32:39.87716624 +0000 UTC m=+1321.260829688" lastFinishedPulling="2026-04-16 18:32:50.846219001 +0000 UTC m=+1332.229882453" observedRunningTime="2026-04-16 18:32:51.943853666 +0000 UTC m=+1333.327517137" watchObservedRunningTime="2026-04-16 18:32:51.94476169 +0000 UTC m=+1333.328425161" Apr 16 18:32:52.930153 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:52.930111 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" podUID="08d643f5-b594-497e-b002-af127afbee90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 18:32:54.615063 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:32:54.614998 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" podUID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.42:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 18:33:02.930596 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:02.930512 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" podUID="08d643f5-b594-497e-b002-af127afbee90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 18:33:04.615318 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:04.615277 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" podUID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.42:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 18:33:04.615693 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:04.615394 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" Apr 16 18:33:04.972433 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:04.972383 2567 generic.go:358] "Generic (PLEG): container finished" podID="368c76b7-70da-474b-abb4-b3d09dc918f6" 
containerID="498a05eaa3b8ee7a73db214ba0ded841d76c2a23c7727adc9a30cbcf3998fab8" exitCode=0 Apr 16 18:33:04.972583 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:04.972467 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" event={"ID":"368c76b7-70da-474b-abb4-b3d09dc918f6","Type":"ContainerDied","Data":"498a05eaa3b8ee7a73db214ba0ded841d76c2a23c7727adc9a30cbcf3998fab8"} Apr 16 18:33:05.038549 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:05.038523 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" Apr 16 18:33:05.124890 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:05.124851 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/368c76b7-70da-474b-abb4-b3d09dc918f6-kserve-provision-location\") pod \"368c76b7-70da-474b-abb4-b3d09dc918f6\" (UID: \"368c76b7-70da-474b-abb4-b3d09dc918f6\") " Apr 16 18:33:05.125218 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:05.125196 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/368c76b7-70da-474b-abb4-b3d09dc918f6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "368c76b7-70da-474b-abb4-b3d09dc918f6" (UID: "368c76b7-70da-474b-abb4-b3d09dc918f6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:33:05.226502 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:05.226460 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/368c76b7-70da-474b-abb4-b3d09dc918f6-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:33:05.977721 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:05.977681 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" event={"ID":"368c76b7-70da-474b-abb4-b3d09dc918f6","Type":"ContainerDied","Data":"cb1f6bd4e065fce3908f627fbca30bbf70a399f13b11094c021df0b4b814fde6"} Apr 16 18:33:05.978132 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:05.977731 2567 scope.go:117] "RemoveContainer" containerID="498a05eaa3b8ee7a73db214ba0ded841d76c2a23c7727adc9a30cbcf3998fab8" Apr 16 18:33:05.978132 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:05.977786 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp" Apr 16 18:33:05.985541 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:05.985512 2567 scope.go:117] "RemoveContainer" containerID="7b35de25485432fb5e1af1b8aa0e616314238214a6604e9a1444543f096a67f7" Apr 16 18:33:05.992643 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:05.992628 2567 scope.go:117] "RemoveContainer" containerID="0525bedeef3f403ed3e34d57b7955c5b458e1b4e35c2090f4f086563f52c15c1" Apr 16 18:33:05.995583 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:05.995559 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp"] Apr 16 18:33:05.998932 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:05.998906 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65779c8984-b6qxp"] Apr 16 18:33:07.241688 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:07.241655 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="368c76b7-70da-474b-abb4-b3d09dc918f6" path="/var/lib/kubelet/pods/368c76b7-70da-474b-abb4-b3d09dc918f6/volumes" Apr 16 18:33:12.930334 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:12.930292 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" podUID="08d643f5-b594-497e-b002-af127afbee90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 18:33:22.930982 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:22.930934 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" podUID="08d643f5-b594-497e-b002-af127afbee90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 18:33:32.931305 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:32.931273 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" Apr 16 18:33:36.507497 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.507462 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x"] Apr 16 18:33:36.507951 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.507723 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" podUID="08d643f5-b594-497e-b002-af127afbee90" containerName="kserve-container" containerID="cri-o://0696d216e0c3e9026a8e66b77629563d08dc75bdd3828f7fec5358e1febac357" gracePeriod=30 Apr 16 18:33:36.594492 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.594447 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s"] Apr 16 18:33:36.594976 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.594954 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerName="kserve-agent" Apr 16 18:33:36.594976 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.594975 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerName="kserve-agent" Apr 16 18:33:36.595150 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.594990 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerName="kserve-container" Apr 16 18:33:36.595150 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.594996 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerName="kserve-container" Apr 16 18:33:36.595150 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.595011 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerName="storage-initializer" Apr 16 18:33:36.595150 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.595017 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerName="storage-initializer" Apr 16 18:33:36.595150 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.595088 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerName="kserve-container" Apr 16 18:33:36.595150 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.595098 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="368c76b7-70da-474b-abb4-b3d09dc918f6" containerName="kserve-agent" Apr 16 18:33:36.598019 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.598001 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" Apr 16 18:33:36.607857 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.607830 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s"] Apr 16 18:33:36.684994 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.684958 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-7q82s\" (UID: \"da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" Apr 16 18:33:36.786346 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.786257 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-7q82s\" (UID: \"da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" Apr 16 18:33:36.786661 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.786636 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-7q82s\" (UID: \"da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" Apr 16 18:33:36.910415 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:36.910373 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" Apr 16 18:33:37.032553 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:37.032326 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s"] Apr 16 18:33:37.034765 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:33:37.034739 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda8aab43_8e4e_4bcf_ae3e_8b8c3d624e84.slice/crio-d283370ff687a02be687c4d0170e5b1dc7ce4d6bef8b40bc3c4f3f7cab901613 WatchSource:0}: Error finding container d283370ff687a02be687c4d0170e5b1dc7ce4d6bef8b40bc3c4f3f7cab901613: Status 404 returned error can't find the container with id d283370ff687a02be687c4d0170e5b1dc7ce4d6bef8b40bc3c4f3f7cab901613 Apr 16 18:33:37.087068 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:37.087025 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" event={"ID":"da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84","Type":"ContainerStarted","Data":"d283370ff687a02be687c4d0170e5b1dc7ce4d6bef8b40bc3c4f3f7cab901613"} Apr 16 18:33:38.091844 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:38.091802 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" event={"ID":"da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84","Type":"ContainerStarted","Data":"703bd634bf8ba0fa98bf1eccde32eb8c456997d1760747a79f2f3b6d39141575"} Apr 16 18:33:39.255097 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:39.255067 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" Apr 16 18:33:39.411084 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:39.410970 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08d643f5-b594-497e-b002-af127afbee90-kserve-provision-location\") pod \"08d643f5-b594-497e-b002-af127afbee90\" (UID: \"08d643f5-b594-497e-b002-af127afbee90\") " Apr 16 18:33:39.420795 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:39.420767 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08d643f5-b594-497e-b002-af127afbee90-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "08d643f5-b594-497e-b002-af127afbee90" (UID: "08d643f5-b594-497e-b002-af127afbee90"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:33:39.512427 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:39.512386 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08d643f5-b594-497e-b002-af127afbee90-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:33:40.099776 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:40.099738 2567 generic.go:358] "Generic (PLEG): container finished" podID="08d643f5-b594-497e-b002-af127afbee90" containerID="0696d216e0c3e9026a8e66b77629563d08dc75bdd3828f7fec5358e1febac357" exitCode=0 Apr 16 18:33:40.099980 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:40.099806 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" event={"ID":"08d643f5-b594-497e-b002-af127afbee90","Type":"ContainerDied","Data":"0696d216e0c3e9026a8e66b77629563d08dc75bdd3828f7fec5358e1febac357"} Apr 16 18:33:40.099980 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:40.099807 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" Apr 16 18:33:40.099980 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:40.099832 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x" event={"ID":"08d643f5-b594-497e-b002-af127afbee90","Type":"ContainerDied","Data":"43fb8d11932c43fcfc0d6f0a74c634fd94c7da722fd0f7061a3f137ec493ea9b"} Apr 16 18:33:40.099980 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:40.099856 2567 scope.go:117] "RemoveContainer" containerID="0696d216e0c3e9026a8e66b77629563d08dc75bdd3828f7fec5358e1febac357" Apr 16 18:33:40.108159 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:40.108140 2567 scope.go:117] "RemoveContainer" containerID="2e390fa37683de9259f1bef7c2495fb9d2818bfe78cc6985bd7d9df8b9628ee2" Apr 16 18:33:40.114945 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:40.114928 2567 scope.go:117] "RemoveContainer" containerID="0696d216e0c3e9026a8e66b77629563d08dc75bdd3828f7fec5358e1febac357" Apr 16 18:33:40.115234 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:33:40.115210 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0696d216e0c3e9026a8e66b77629563d08dc75bdd3828f7fec5358e1febac357\": container with ID starting with 0696d216e0c3e9026a8e66b77629563d08dc75bdd3828f7fec5358e1febac357 not found: ID does not exist" containerID="0696d216e0c3e9026a8e66b77629563d08dc75bdd3828f7fec5358e1febac357" Apr 16 18:33:40.115294 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:40.115249 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0696d216e0c3e9026a8e66b77629563d08dc75bdd3828f7fec5358e1febac357"} err="failed to get container status \"0696d216e0c3e9026a8e66b77629563d08dc75bdd3828f7fec5358e1febac357\": rpc error: code = NotFound desc = could not find container \"0696d216e0c3e9026a8e66b77629563d08dc75bdd3828f7fec5358e1febac357\": container with ID starting with 0696d216e0c3e9026a8e66b77629563d08dc75bdd3828f7fec5358e1febac357 not found: ID does not exist" Apr 16 18:33:40.115294 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:40.115275 2567 scope.go:117] "RemoveContainer" containerID="2e390fa37683de9259f1bef7c2495fb9d2818bfe78cc6985bd7d9df8b9628ee2" Apr 16 18:33:40.115529 ip-10-0-141-192 kubenswrapper[2567]: E0416 
18:33:40.115513 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e390fa37683de9259f1bef7c2495fb9d2818bfe78cc6985bd7d9df8b9628ee2\": container with ID starting with 2e390fa37683de9259f1bef7c2495fb9d2818bfe78cc6985bd7d9df8b9628ee2 not found: ID does not exist" containerID="2e390fa37683de9259f1bef7c2495fb9d2818bfe78cc6985bd7d9df8b9628ee2" Apr 16 18:33:40.115578 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:40.115533 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e390fa37683de9259f1bef7c2495fb9d2818bfe78cc6985bd7d9df8b9628ee2"} err="failed to get container status \"2e390fa37683de9259f1bef7c2495fb9d2818bfe78cc6985bd7d9df8b9628ee2\": rpc error: code = NotFound desc = could not find container \"2e390fa37683de9259f1bef7c2495fb9d2818bfe78cc6985bd7d9df8b9628ee2\": container with ID starting with 2e390fa37683de9259f1bef7c2495fb9d2818bfe78cc6985bd7d9df8b9628ee2 not found: ID does not exist" Apr 16 18:33:40.121384 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:40.121362 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x"] Apr 16 18:33:40.124507 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:40.124487 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zmq4x"] Apr 16 18:33:41.241775 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:41.241735 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d643f5-b594-497e-b002-af127afbee90" path="/var/lib/kubelet/pods/08d643f5-b594-497e-b002-af127afbee90/volumes" Apr 16 18:33:42.110114 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:42.110080 2567 generic.go:358] "Generic (PLEG): container finished" podID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" containerID="703bd634bf8ba0fa98bf1eccde32eb8c456997d1760747a79f2f3b6d39141575" exitCode=0 Apr 16 18:33:42.110378 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:42.110122 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" event={"ID":"da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84","Type":"ContainerDied","Data":"703bd634bf8ba0fa98bf1eccde32eb8c456997d1760747a79f2f3b6d39141575"} Apr 16 18:33:43.115302 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:43.115269 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" event={"ID":"da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84","Type":"ContainerStarted","Data":"6528205dbf7df388102aba3a997db5b5a3525fe250a29a3be28738a9085e9dcf"} Apr 16 18:33:43.115691 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:43.115544 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" Apr 16 18:33:43.116926 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:43.116871 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" podUID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 18:33:43.132383 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:43.132341 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" 
podStartSLOduration=7.132327913 podStartE2EDuration="7.132327913s" podCreationTimestamp="2026-04-16 18:33:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:33:43.131647136 +0000 UTC m=+1384.515310607" watchObservedRunningTime="2026-04-16 18:33:43.132327913 +0000 UTC m=+1384.515991385" Apr 16 18:33:44.118534 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:44.118489 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" podUID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 18:33:54.118471 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:33:54.118427 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" podUID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 18:34:04.118522 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:04.118476 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" podUID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 18:34:14.118679 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:14.118637 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" podUID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 18:34:24.119171 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:24.119115 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" podUID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 18:34:34.120391 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:34.120311 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" Apr 16 18:34:38.134412 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.134373 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s"] Apr 16 18:34:38.134812 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.134658 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" podUID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" containerName="kserve-container" containerID="cri-o://6528205dbf7df388102aba3a997db5b5a3525fe250a29a3be28738a9085e9dcf" gracePeriod=30 Apr 16 18:34:38.174789 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.174753 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm"] Apr 16 18:34:38.175235 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.175216 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08d643f5-b594-497e-b002-af127afbee90" 
containerName="kserve-container" Apr 16 18:34:38.175332 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.175237 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d643f5-b594-497e-b002-af127afbee90" containerName="kserve-container" Apr 16 18:34:38.175332 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.175256 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08d643f5-b594-497e-b002-af127afbee90" containerName="storage-initializer" Apr 16 18:34:38.175332 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.175264 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d643f5-b594-497e-b002-af127afbee90" containerName="storage-initializer" Apr 16 18:34:38.175481 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.175354 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="08d643f5-b594-497e-b002-af127afbee90" containerName="kserve-container" Apr 16 18:34:38.178312 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.178290 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" Apr 16 18:34:38.188595 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.188566 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm"] Apr 16 18:34:38.324730 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.324679 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59c8451a-365f-4e57-aeba-840fb0053fd4-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm\" (UID: \"59c8451a-365f-4e57-aeba-840fb0053fd4\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" Apr 16 18:34:38.425741 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.425648 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59c8451a-365f-4e57-aeba-840fb0053fd4-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm\" (UID: \"59c8451a-365f-4e57-aeba-840fb0053fd4\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" Apr 16 18:34:38.426021 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.425998 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59c8451a-365f-4e57-aeba-840fb0053fd4-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm\" (UID: \"59c8451a-365f-4e57-aeba-840fb0053fd4\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" Apr 16 18:34:38.492918 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.492886 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" Apr 16 18:34:38.618140 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:38.618116 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm"] Apr 16 18:34:38.620707 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:34:38.620667 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59c8451a_365f_4e57_aeba_840fb0053fd4.slice/crio-4562a1282eb818134e78557896ae314df25e35aa56d5732c164958dd56f0789e WatchSource:0}: Error finding container 4562a1282eb818134e78557896ae314df25e35aa56d5732c164958dd56f0789e: Status 404 returned error can't find the container with id 4562a1282eb818134e78557896ae314df25e35aa56d5732c164958dd56f0789e Apr 16 18:34:39.302217 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:39.302180 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" event={"ID":"59c8451a-365f-4e57-aeba-840fb0053fd4","Type":"ContainerStarted","Data":"7f8fd8c9c25edad843c3b603313551dcff9a23a6cb7dd822339836fb6058e393"} Apr 16 18:34:39.302217 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:39.302221 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" event={"ID":"59c8451a-365f-4e57-aeba-840fb0053fd4","Type":"ContainerStarted","Data":"4562a1282eb818134e78557896ae314df25e35aa56d5732c164958dd56f0789e"} Apr 16 18:34:40.866968 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:40.866939 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" Apr 16 18:34:41.047821 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.047790 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84-kserve-provision-location\") pod \"da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84\" (UID: \"da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84\") " Apr 16 18:34:41.057201 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.057173 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" (UID: "da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:34:41.149410 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.149366 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:34:41.310248 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.310153 2567 generic.go:358] "Generic (PLEG): container finished" podID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" containerID="6528205dbf7df388102aba3a997db5b5a3525fe250a29a3be28738a9085e9dcf" exitCode=0 Apr 16 18:34:41.310248 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.310191 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" event={"ID":"da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84","Type":"ContainerDied","Data":"6528205dbf7df388102aba3a997db5b5a3525fe250a29a3be28738a9085e9dcf"} Apr 16 18:34:41.310248 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.310212 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" event={"ID":"da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84","Type":"ContainerDied","Data":"d283370ff687a02be687c4d0170e5b1dc7ce4d6bef8b40bc3c4f3f7cab901613"} Apr 16 18:34:41.310248 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.310222 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s" Apr 16 18:34:41.310563 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.310232 2567 scope.go:117] "RemoveContainer" containerID="6528205dbf7df388102aba3a997db5b5a3525fe250a29a3be28738a9085e9dcf" Apr 16 18:34:41.318187 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.318171 2567 scope.go:117] "RemoveContainer" containerID="703bd634bf8ba0fa98bf1eccde32eb8c456997d1760747a79f2f3b6d39141575" Apr 16 18:34:41.327074 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.327034 2567 scope.go:117] "RemoveContainer" containerID="6528205dbf7df388102aba3a997db5b5a3525fe250a29a3be28738a9085e9dcf" Apr 16 18:34:41.327332 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:34:41.327315 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6528205dbf7df388102aba3a997db5b5a3525fe250a29a3be28738a9085e9dcf\": container with ID starting with 6528205dbf7df388102aba3a997db5b5a3525fe250a29a3be28738a9085e9dcf not found: ID does not exist" containerID="6528205dbf7df388102aba3a997db5b5a3525fe250a29a3be28738a9085e9dcf" Apr 16 18:34:41.327423 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.327343 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6528205dbf7df388102aba3a997db5b5a3525fe250a29a3be28738a9085e9dcf"} err="failed to get container status \"6528205dbf7df388102aba3a997db5b5a3525fe250a29a3be28738a9085e9dcf\": rpc error: code = NotFound desc = could not find container \"6528205dbf7df388102aba3a997db5b5a3525fe250a29a3be28738a9085e9dcf\": container with ID starting with 6528205dbf7df388102aba3a997db5b5a3525fe250a29a3be28738a9085e9dcf not found: ID does not exist" Apr 16 18:34:41.327423 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.327364 2567 scope.go:117] "RemoveContainer" containerID="703bd634bf8ba0fa98bf1eccde32eb8c456997d1760747a79f2f3b6d39141575" Apr 16 18:34:41.327661 ip-10-0-141-192 
kubenswrapper[2567]: E0416 18:34:41.327644 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"703bd634bf8ba0fa98bf1eccde32eb8c456997d1760747a79f2f3b6d39141575\": container with ID starting with 703bd634bf8ba0fa98bf1eccde32eb8c456997d1760747a79f2f3b6d39141575 not found: ID does not exist" containerID="703bd634bf8ba0fa98bf1eccde32eb8c456997d1760747a79f2f3b6d39141575" Apr 16 18:34:41.327700 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.327668 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"703bd634bf8ba0fa98bf1eccde32eb8c456997d1760747a79f2f3b6d39141575"} err="failed to get container status \"703bd634bf8ba0fa98bf1eccde32eb8c456997d1760747a79f2f3b6d39141575\": rpc error: code = NotFound desc = could not find container \"703bd634bf8ba0fa98bf1eccde32eb8c456997d1760747a79f2f3b6d39141575\": container with ID starting with 703bd634bf8ba0fa98bf1eccde32eb8c456997d1760747a79f2f3b6d39141575 not found: ID does not exist" Apr 16 18:34:41.328386 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.328367 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s"] Apr 16 18:34:41.332895 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:41.332877 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-7q82s"] Apr 16 18:34:43.242478 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:43.242386 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" path="/var/lib/kubelet/pods/da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84/volumes" Apr 16 18:34:43.319541 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:43.319507 2567 generic.go:358] "Generic (PLEG): container finished" podID="59c8451a-365f-4e57-aeba-840fb0053fd4" containerID="7f8fd8c9c25edad843c3b603313551dcff9a23a6cb7dd822339836fb6058e393" exitCode=0 Apr 16 18:34:43.319678 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:43.319578 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" event={"ID":"59c8451a-365f-4e57-aeba-840fb0053fd4","Type":"ContainerDied","Data":"7f8fd8c9c25edad843c3b603313551dcff9a23a6cb7dd822339836fb6058e393"} Apr 16 18:34:44.324984 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:44.324951 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" event={"ID":"59c8451a-365f-4e57-aeba-840fb0053fd4","Type":"ContainerStarted","Data":"fc0e9eda1e03f478b0902b95da5f662552187d585b23c8119ac4309ec652994f"} Apr 16 18:34:44.325368 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:44.325252 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" Apr 16 18:34:44.326679 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:44.326654 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" podUID="59c8451a-365f-4e57-aeba-840fb0053fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 18:34:44.344502 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:44.344454 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" podStartSLOduration=6.344439082 podStartE2EDuration="6.344439082s" podCreationTimestamp="2026-04-16 18:34:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:34:44.340965902 +0000 UTC m=+1445.724629375" watchObservedRunningTime="2026-04-16 18:34:44.344439082 +0000 UTC m=+1445.728102553" Apr 16 18:34:45.328446 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:45.328395 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" podUID="59c8451a-365f-4e57-aeba-840fb0053fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 18:34:55.328560 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:34:55.328521 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" podUID="59c8451a-365f-4e57-aeba-840fb0053fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 18:35:05.328585 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:05.328539 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" podUID="59c8451a-365f-4e57-aeba-840fb0053fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 18:35:15.328494 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:15.328445 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" podUID="59c8451a-365f-4e57-aeba-840fb0053fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 18:35:25.328524 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:25.328483 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" podUID="59c8451a-365f-4e57-aeba-840fb0053fd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 18:35:35.329322 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:35.329285 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" Apr 16 18:35:39.912645 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:39.912607 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm"] Apr 16 18:35:39.913008 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:39.912951 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" podUID="59c8451a-365f-4e57-aeba-840fb0053fd4" containerName="kserve-container" containerID="cri-o://fc0e9eda1e03f478b0902b95da5f662552187d585b23c8119ac4309ec652994f" gracePeriod=30 Apr 16 18:35:39.976914 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:39.976876 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c"] Apr 16 18:35:39.977380 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:39.977357 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" containerName="kserve-container" Apr 16 18:35:39.977380 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:39.977379 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" containerName="kserve-container" Apr 16 18:35:39.977579 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:39.977437 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" containerName="storage-initializer" Apr 16 18:35:39.977579 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:39.977447 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" containerName="storage-initializer" Apr 16 18:35:39.977579 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:39.977550 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="da8aab43-8e4e-4bcf-ae3e-8b8c3d624e84" containerName="kserve-container" Apr 16 18:35:39.981344 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:39.981328 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" Apr 16 18:35:39.994079 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:39.992488 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c"] Apr 16 18:35:40.142241 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:40.142191 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de0f776b-5762-4480-9af7-abdfc06a41e6-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-pln4c\" (UID: \"de0f776b-5762-4480-9af7-abdfc06a41e6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" Apr 16 18:35:40.243292 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:40.243253 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de0f776b-5762-4480-9af7-abdfc06a41e6-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-pln4c\" (UID: \"de0f776b-5762-4480-9af7-abdfc06a41e6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" Apr 16 18:35:40.243692 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:40.243668 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de0f776b-5762-4480-9af7-abdfc06a41e6-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-pln4c\" (UID: \"de0f776b-5762-4480-9af7-abdfc06a41e6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" Apr 16 18:35:40.294861 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:40.294819 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" Apr 16 18:35:40.421387 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:40.421316 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c"] Apr 16 18:35:40.424095 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:35:40.424060 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde0f776b_5762_4480_9af7_abdfc06a41e6.slice/crio-22e2145a8bca018d32fd1d755d8ce12dd974eee42db35cb94229239bb7cad396 WatchSource:0}: Error finding container 22e2145a8bca018d32fd1d755d8ce12dd974eee42db35cb94229239bb7cad396: Status 404 returned error can't find the container with id 22e2145a8bca018d32fd1d755d8ce12dd974eee42db35cb94229239bb7cad396 Apr 16 18:35:40.516582 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:40.516506 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" event={"ID":"de0f776b-5762-4480-9af7-abdfc06a41e6","Type":"ContainerStarted","Data":"685db18bc380ad835be4d6e0c400e39a2f7241b6bbc2cf88e8b7115bcdcdeb0f"} Apr 16 18:35:40.516582 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:40.516545 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" event={"ID":"de0f776b-5762-4480-9af7-abdfc06a41e6","Type":"ContainerStarted","Data":"22e2145a8bca018d32fd1d755d8ce12dd974eee42db35cb94229239bb7cad396"} Apr 16 18:35:42.649285 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:42.649262 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" Apr 16 18:35:42.765376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:42.765265 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59c8451a-365f-4e57-aeba-840fb0053fd4-kserve-provision-location\") pod \"59c8451a-365f-4e57-aeba-840fb0053fd4\" (UID: \"59c8451a-365f-4e57-aeba-840fb0053fd4\") " Apr 16 18:35:42.774788 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:42.774759 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59c8451a-365f-4e57-aeba-840fb0053fd4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "59c8451a-365f-4e57-aeba-840fb0053fd4" (UID: "59c8451a-365f-4e57-aeba-840fb0053fd4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:35:42.866680 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:42.866653 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59c8451a-365f-4e57-aeba-840fb0053fd4-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:35:43.527136 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:43.527029 2567 generic.go:358] "Generic (PLEG): container finished" podID="59c8451a-365f-4e57-aeba-840fb0053fd4" containerID="fc0e9eda1e03f478b0902b95da5f662552187d585b23c8119ac4309ec652994f" exitCode=0 Apr 16 18:35:43.527136 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:43.527089 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" event={"ID":"59c8451a-365f-4e57-aeba-840fb0053fd4","Type":"ContainerDied","Data":"fc0e9eda1e03f478b0902b95da5f662552187d585b23c8119ac4309ec652994f"} Apr 16 18:35:43.527136 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:43.527112 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" Apr 16 18:35:43.527136 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:43.527122 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm" event={"ID":"59c8451a-365f-4e57-aeba-840fb0053fd4","Type":"ContainerDied","Data":"4562a1282eb818134e78557896ae314df25e35aa56d5732c164958dd56f0789e"} Apr 16 18:35:43.527408 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:43.527144 2567 scope.go:117] "RemoveContainer" containerID="fc0e9eda1e03f478b0902b95da5f662552187d585b23c8119ac4309ec652994f" Apr 16 18:35:43.534969 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:43.534951 2567 scope.go:117] "RemoveContainer" containerID="7f8fd8c9c25edad843c3b603313551dcff9a23a6cb7dd822339836fb6058e393" Apr 16 18:35:43.541815 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:43.541793 2567 scope.go:117] "RemoveContainer" containerID="fc0e9eda1e03f478b0902b95da5f662552187d585b23c8119ac4309ec652994f" Apr 16 18:35:43.542081 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:35:43.542055 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc0e9eda1e03f478b0902b95da5f662552187d585b23c8119ac4309ec652994f\": container with ID starting with fc0e9eda1e03f478b0902b95da5f662552187d585b23c8119ac4309ec652994f not found: ID does not exist" containerID="fc0e9eda1e03f478b0902b95da5f662552187d585b23c8119ac4309ec652994f" Apr 16 18:35:43.542169 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:43.542089 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc0e9eda1e03f478b0902b95da5f662552187d585b23c8119ac4309ec652994f"} err="failed to get container status \"fc0e9eda1e03f478b0902b95da5f662552187d585b23c8119ac4309ec652994f\": rpc error: code = NotFound desc = could not find container \"fc0e9eda1e03f478b0902b95da5f662552187d585b23c8119ac4309ec652994f\": container with ID starting with fc0e9eda1e03f478b0902b95da5f662552187d585b23c8119ac4309ec652994f not found: ID does not exist" Apr 16 18:35:43.542169 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:43.542107 2567 scope.go:117] "RemoveContainer" containerID="7f8fd8c9c25edad843c3b603313551dcff9a23a6cb7dd822339836fb6058e393" Apr 16 18:35:43.542346 ip-10-0-141-192 
kubenswrapper[2567]: E0416 18:35:43.542328 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f8fd8c9c25edad843c3b603313551dcff9a23a6cb7dd822339836fb6058e393\": container with ID starting with 7f8fd8c9c25edad843c3b603313551dcff9a23a6cb7dd822339836fb6058e393 not found: ID does not exist" containerID="7f8fd8c9c25edad843c3b603313551dcff9a23a6cb7dd822339836fb6058e393" Apr 16 18:35:43.542386 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:43.542352 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8fd8c9c25edad843c3b603313551dcff9a23a6cb7dd822339836fb6058e393"} err="failed to get container status \"7f8fd8c9c25edad843c3b603313551dcff9a23a6cb7dd822339836fb6058e393\": rpc error: code = NotFound desc = could not find container \"7f8fd8c9c25edad843c3b603313551dcff9a23a6cb7dd822339836fb6058e393\": container with ID starting with 7f8fd8c9c25edad843c3b603313551dcff9a23a6cb7dd822339836fb6058e393 not found: ID does not exist" Apr 16 18:35:43.546918 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:43.546895 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm"] Apr 16 18:35:43.548570 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:43.548549 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-fqvhm"] Apr 16 18:35:44.531517 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:44.531484 2567 generic.go:358] "Generic (PLEG): container finished" podID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerID="685db18bc380ad835be4d6e0c400e39a2f7241b6bbc2cf88e8b7115bcdcdeb0f" exitCode=0 Apr 16 18:35:44.531935 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:44.531555 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" event={"ID":"de0f776b-5762-4480-9af7-abdfc06a41e6","Type":"ContainerDied","Data":"685db18bc380ad835be4d6e0c400e39a2f7241b6bbc2cf88e8b7115bcdcdeb0f"} Apr 16 18:35:45.245963 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:45.245536 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59c8451a-365f-4e57-aeba-840fb0053fd4" path="/var/lib/kubelet/pods/59c8451a-365f-4e57-aeba-840fb0053fd4/volumes" Apr 16 18:35:51.568325 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:51.568290 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" event={"ID":"de0f776b-5762-4480-9af7-abdfc06a41e6","Type":"ContainerStarted","Data":"7402f81c7b2684cc0ab0b0ce61e9b7b92f22b87ef4765558272e16334e89000e"} Apr 16 18:35:51.568728 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:51.568562 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" Apr 16 18:35:51.570099 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:51.570072 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:35:51.589931 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:51.589850 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" 
podStartSLOduration=5.797736704 podStartE2EDuration="12.589839648s" podCreationTimestamp="2026-04-16 18:35:39 +0000 UTC" firstStartedPulling="2026-04-16 18:35:44.532750234 +0000 UTC m=+1505.916413684" lastFinishedPulling="2026-04-16 18:35:51.324853177 +0000 UTC m=+1512.708516628" observedRunningTime="2026-04-16 18:35:51.587720667 +0000 UTC m=+1512.971384151" watchObservedRunningTime="2026-04-16 18:35:51.589839648 +0000 UTC m=+1512.973503120" Apr 16 18:35:52.571476 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:35:52.571437 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:36:02.571659 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:36:02.571561 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:36:12.572375 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:36:12.572330 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:36:22.572015 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:36:22.571970 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:36:32.571931 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:36:32.571890 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:36:42.571565 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:36:42.571525 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:36:52.571904 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:36:52.571858 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:37:02.571825 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:02.571780 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:37:12.573051 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:12.573002 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" Apr 16 18:37:21.128395 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.128033 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c"] Apr 16 18:37:21.128941 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.128627 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="kserve-container" containerID="cri-o://7402f81c7b2684cc0ab0b0ce61e9b7b92f22b87ef4765558272e16334e89000e" gracePeriod=30 Apr 16 18:37:21.231489 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.231454 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9"] Apr 16 18:37:21.231880 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.231864 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59c8451a-365f-4e57-aeba-840fb0053fd4" containerName="storage-initializer" Apr 16 18:37:21.231953 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.231882 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c8451a-365f-4e57-aeba-840fb0053fd4" containerName="storage-initializer" Apr 16 18:37:21.231953 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.231908 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59c8451a-365f-4e57-aeba-840fb0053fd4" containerName="kserve-container" Apr 16 18:37:21.231953 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.231914 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c8451a-365f-4e57-aeba-840fb0053fd4" containerName="kserve-container" Apr 16 18:37:21.232071 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.231968 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="59c8451a-365f-4e57-aeba-840fb0053fd4" containerName="kserve-container" Apr 16 18:37:21.237267 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.237233 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" Apr 16 18:37:21.245434 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.245406 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9"] Apr 16 18:37:21.396910 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.396816 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68c90c10-e870-4bb9-9935-4c2e2103fb01-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-pqkq9\" (UID: \"68c90c10-e870-4bb9-9935-4c2e2103fb01\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" Apr 16 18:37:21.498010 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.497979 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68c90c10-e870-4bb9-9935-4c2e2103fb01-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-pqkq9\" (UID: \"68c90c10-e870-4bb9-9935-4c2e2103fb01\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" Apr 16 18:37:21.498484 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.498460 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68c90c10-e870-4bb9-9935-4c2e2103fb01-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-pqkq9\" (UID: \"68c90c10-e870-4bb9-9935-4c2e2103fb01\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" Apr 16 18:37:21.551145 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.551111 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" Apr 16 18:37:21.681721 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.681697 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9"] Apr 16 18:37:21.682988 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:37:21.682958 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68c90c10_e870_4bb9_9935_4c2e2103fb01.slice/crio-5ba59775a7e3bb5c71c222c05c1a0757fae0315056894a7e1328f0dccb66d48a WatchSource:0}: Error finding container 5ba59775a7e3bb5c71c222c05c1a0757fae0315056894a7e1328f0dccb66d48a: Status 404 returned error can't find the container with id 5ba59775a7e3bb5c71c222c05c1a0757fae0315056894a7e1328f0dccb66d48a Apr 16 18:37:21.865731 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.865697 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" event={"ID":"68c90c10-e870-4bb9-9935-4c2e2103fb01","Type":"ContainerStarted","Data":"5df945ec8a2ed5c8e12d03b3f878376cc0953813af4424ebfcba1d5caa6ed8ce"} Apr 16 18:37:21.865731 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:21.865737 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" event={"ID":"68c90c10-e870-4bb9-9935-4c2e2103fb01","Type":"ContainerStarted","Data":"5ba59775a7e3bb5c71c222c05c1a0757fae0315056894a7e1328f0dccb66d48a"} Apr 16 18:37:22.572428 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:22.572382 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:37:24.772887 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:24.772857 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" Apr 16 18:37:24.876250 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:24.876158 2567 generic.go:358] "Generic (PLEG): container finished" podID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerID="7402f81c7b2684cc0ab0b0ce61e9b7b92f22b87ef4765558272e16334e89000e" exitCode=0 Apr 16 18:37:24.876250 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:24.876202 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" event={"ID":"de0f776b-5762-4480-9af7-abdfc06a41e6","Type":"ContainerDied","Data":"7402f81c7b2684cc0ab0b0ce61e9b7b92f22b87ef4765558272e16334e89000e"} Apr 16 18:37:24.876250 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:24.876224 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" event={"ID":"de0f776b-5762-4480-9af7-abdfc06a41e6","Type":"ContainerDied","Data":"22e2145a8bca018d32fd1d755d8ce12dd974eee42db35cb94229239bb7cad396"} Apr 16 18:37:24.876250 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:24.876238 2567 scope.go:117] "RemoveContainer" containerID="7402f81c7b2684cc0ab0b0ce61e9b7b92f22b87ef4765558272e16334e89000e" Apr 16 18:37:24.876250 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:24.876239 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c" Apr 16 18:37:24.883888 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:24.883860 2567 scope.go:117] "RemoveContainer" containerID="685db18bc380ad835be4d6e0c400e39a2f7241b6bbc2cf88e8b7115bcdcdeb0f" Apr 16 18:37:24.890732 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:24.890716 2567 scope.go:117] "RemoveContainer" containerID="7402f81c7b2684cc0ab0b0ce61e9b7b92f22b87ef4765558272e16334e89000e" Apr 16 18:37:24.890981 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:37:24.890960 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7402f81c7b2684cc0ab0b0ce61e9b7b92f22b87ef4765558272e16334e89000e\": container with ID starting with 7402f81c7b2684cc0ab0b0ce61e9b7b92f22b87ef4765558272e16334e89000e not found: ID does not exist" containerID="7402f81c7b2684cc0ab0b0ce61e9b7b92f22b87ef4765558272e16334e89000e" Apr 16 18:37:24.891090 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:24.890989 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7402f81c7b2684cc0ab0b0ce61e9b7b92f22b87ef4765558272e16334e89000e"} err="failed to get container status \"7402f81c7b2684cc0ab0b0ce61e9b7b92f22b87ef4765558272e16334e89000e\": rpc error: code = NotFound desc = could not find container \"7402f81c7b2684cc0ab0b0ce61e9b7b92f22b87ef4765558272e16334e89000e\": container with ID starting with 7402f81c7b2684cc0ab0b0ce61e9b7b92f22b87ef4765558272e16334e89000e not found: ID does not exist" Apr 16 18:37:24.891090 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:24.891006 2567 scope.go:117] "RemoveContainer" containerID="685db18bc380ad835be4d6e0c400e39a2f7241b6bbc2cf88e8b7115bcdcdeb0f" Apr 16 18:37:24.891341 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:37:24.891255 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"685db18bc380ad835be4d6e0c400e39a2f7241b6bbc2cf88e8b7115bcdcdeb0f\": container with ID starting with 685db18bc380ad835be4d6e0c400e39a2f7241b6bbc2cf88e8b7115bcdcdeb0f not found: ID does not exist" containerID="685db18bc380ad835be4d6e0c400e39a2f7241b6bbc2cf88e8b7115bcdcdeb0f" Apr 16 18:37:24.891341 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:24.891286 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685db18bc380ad835be4d6e0c400e39a2f7241b6bbc2cf88e8b7115bcdcdeb0f"} err="failed to get container status \"685db18bc380ad835be4d6e0c400e39a2f7241b6bbc2cf88e8b7115bcdcdeb0f\": rpc error: code = NotFound desc = could not find container \"685db18bc380ad835be4d6e0c400e39a2f7241b6bbc2cf88e8b7115bcdcdeb0f\": container with ID starting with 685db18bc380ad835be4d6e0c400e39a2f7241b6bbc2cf88e8b7115bcdcdeb0f not found: ID does not exist" Apr 16 18:37:24.926704 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:24.926674 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de0f776b-5762-4480-9af7-abdfc06a41e6-kserve-provision-location\") pod \"de0f776b-5762-4480-9af7-abdfc06a41e6\" (UID: \"de0f776b-5762-4480-9af7-abdfc06a41e6\") " Apr 16 18:37:24.927015 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:24.926991 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de0f776b-5762-4480-9af7-abdfc06a41e6-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "de0f776b-5762-4480-9af7-abdfc06a41e6" (UID: "de0f776b-5762-4480-9af7-abdfc06a41e6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:37:25.028180 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:25.028141 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de0f776b-5762-4480-9af7-abdfc06a41e6-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:37:25.199556 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:25.199525 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c"] Apr 16 18:37:25.203120 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:25.203090 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-pln4c"] Apr 16 18:37:25.242329 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:25.242294 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" path="/var/lib/kubelet/pods/de0f776b-5762-4480-9af7-abdfc06a41e6/volumes" Apr 16 18:37:25.881037 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:25.880998 2567 generic.go:358] "Generic (PLEG): container finished" podID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerID="5df945ec8a2ed5c8e12d03b3f878376cc0953813af4424ebfcba1d5caa6ed8ce" exitCode=0 Apr 16 18:37:25.881520 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:25.881076 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" event={"ID":"68c90c10-e870-4bb9-9935-4c2e2103fb01","Type":"ContainerDied","Data":"5df945ec8a2ed5c8e12d03b3f878376cc0953813af4424ebfcba1d5caa6ed8ce"} Apr 16 18:37:26.886283 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:26.886253 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" event={"ID":"68c90c10-e870-4bb9-9935-4c2e2103fb01","Type":"ContainerStarted","Data":"00fa7d6d0b21b2fdb6ce8f6d92797705ccb1c250464d22d6e1d2ae5d478e0392"} Apr 16 18:37:26.886788 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:26.886530 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" Apr 16 18:37:26.887690 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:26.887666 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:37:26.904031 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:26.903979 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" podStartSLOduration=5.903965 podStartE2EDuration="5.903965s" podCreationTimestamp="2026-04-16 18:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:37:26.902415886 +0000 UTC m=+1608.286079349" watchObservedRunningTime="2026-04-16 18:37:26.903965 +0000 UTC m=+1608.287628471" Apr 16 18:37:27.890513 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:27.890476 2567 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:37:37.891343 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:37.891299 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:37:47.891301 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:47.891252 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:37:57.891381 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:37:57.891338 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:38:07.891425 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:38:07.891379 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:38:17.891390 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:38:17.891344 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:38:27.890993 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:38:27.890954 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:38:33.242757 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:38:33.242715 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:38:43.238248 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:38:43.238208 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:38:53.241930 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:38:53.241901 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" Apr 16 18:39:02.312677 ip-10-0-141-192 kubenswrapper[2567]: I0416 
18:39:02.312583 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9"] Apr 16 18:39:02.313231 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.312937 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container" containerID="cri-o://00fa7d6d0b21b2fdb6ce8f6d92797705ccb1c250464d22d6e1d2ae5d478e0392" gracePeriod=30 Apr 16 18:39:02.389807 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.389775 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"] Apr 16 18:39:02.390292 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.390275 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="kserve-container" Apr 16 18:39:02.390361 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.390295 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="kserve-container" Apr 16 18:39:02.390361 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.390311 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="storage-initializer" Apr 16 18:39:02.390361 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.390320 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="storage-initializer" Apr 16 18:39:02.390490 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.390417 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="de0f776b-5762-4480-9af7-abdfc06a41e6" containerName="kserve-container" Apr 16 18:39:02.393772 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.393750 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"
Apr 16 18:39:02.402974 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.402950 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"]
Apr 16 18:39:02.436792 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.436762 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b7a0e65-06e5-4b74-88be-d9b225d58609-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx\" (UID: \"9b7a0e65-06e5-4b74-88be-d9b225d58609\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"
Apr 16 18:39:02.537586 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.537550 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b7a0e65-06e5-4b74-88be-d9b225d58609-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx\" (UID: \"9b7a0e65-06e5-4b74-88be-d9b225d58609\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"
Apr 16 18:39:02.537965 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.537944 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b7a0e65-06e5-4b74-88be-d9b225d58609-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx\" (UID: \"9b7a0e65-06e5-4b74-88be-d9b225d58609\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"
Apr 16 18:39:02.705407 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.705375 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"
Apr 16 18:39:02.825567 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.825540 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"]
Apr 16 18:39:02.828318 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:39:02.828285 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b7a0e65_06e5_4b74_88be_d9b225d58609.slice/crio-c501163e371b069ab4fc119a03297d2aa7f3d3720468c12e1473731567da42ba WatchSource:0}: Error finding container c501163e371b069ab4fc119a03297d2aa7f3d3720468c12e1473731567da42ba: Status 404 returned error can't find the container with id c501163e371b069ab4fc119a03297d2aa7f3d3720468c12e1473731567da42ba
Apr 16 18:39:02.830239 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:02.830223 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:39:03.202872 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:03.202838 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" event={"ID":"9b7a0e65-06e5-4b74-88be-d9b225d58609","Type":"ContainerStarted","Data":"b5c574414089149f8570bbce13e3468d60ce40451e132bfcf18734bf4e8928d9"}
Apr 16 18:39:03.202872 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:03.202876 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" event={"ID":"9b7a0e65-06e5-4b74-88be-d9b225d58609","Type":"ContainerStarted","Data":"c501163e371b069ab4fc119a03297d2aa7f3d3720468c12e1473731567da42ba"}
Apr 16 18:39:03.238326 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:03.238280 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused"
Apr 16 18:39:05.846445 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:05.846424 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9"
Apr 16 18:39:05.964818 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:05.964786 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68c90c10-e870-4bb9-9935-4c2e2103fb01-kserve-provision-location\") pod \"68c90c10-e870-4bb9-9935-4c2e2103fb01\" (UID: \"68c90c10-e870-4bb9-9935-4c2e2103fb01\") "
Apr 16 18:39:05.965190 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:05.965161 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68c90c10-e870-4bb9-9935-4c2e2103fb01-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "68c90c10-e870-4bb9-9935-4c2e2103fb01" (UID: "68c90c10-e870-4bb9-9935-4c2e2103fb01"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:39:06.066095 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:06.066063 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68c90c10-e870-4bb9-9935-4c2e2103fb01-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:39:06.213995 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:06.213956 2567 generic.go:358] "Generic (PLEG): container finished" podID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerID="00fa7d6d0b21b2fdb6ce8f6d92797705ccb1c250464d22d6e1d2ae5d478e0392" exitCode=0
Apr 16 18:39:06.214197 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:06.214011 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" event={"ID":"68c90c10-e870-4bb9-9935-4c2e2103fb01","Type":"ContainerDied","Data":"00fa7d6d0b21b2fdb6ce8f6d92797705ccb1c250464d22d6e1d2ae5d478e0392"}
Apr 16 18:39:06.214197 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:06.214035 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9"
Apr 16 18:39:06.214197 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:06.214067 2567 scope.go:117] "RemoveContainer" containerID="00fa7d6d0b21b2fdb6ce8f6d92797705ccb1c250464d22d6e1d2ae5d478e0392"
Apr 16 18:39:06.214197 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:06.214057 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9" event={"ID":"68c90c10-e870-4bb9-9935-4c2e2103fb01","Type":"ContainerDied","Data":"5ba59775a7e3bb5c71c222c05c1a0757fae0315056894a7e1328f0dccb66d48a"}
Apr 16 18:39:06.221944 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:06.221900 2567 scope.go:117] "RemoveContainer" containerID="5df945ec8a2ed5c8e12d03b3f878376cc0953813af4424ebfcba1d5caa6ed8ce"
Apr 16 18:39:06.228876 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:06.228861 2567 scope.go:117] "RemoveContainer" containerID="00fa7d6d0b21b2fdb6ce8f6d92797705ccb1c250464d22d6e1d2ae5d478e0392"
Apr 16 18:39:06.229126 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:39:06.229099 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00fa7d6d0b21b2fdb6ce8f6d92797705ccb1c250464d22d6e1d2ae5d478e0392\": container with ID starting with 00fa7d6d0b21b2fdb6ce8f6d92797705ccb1c250464d22d6e1d2ae5d478e0392 not found: ID does not exist" containerID="00fa7d6d0b21b2fdb6ce8f6d92797705ccb1c250464d22d6e1d2ae5d478e0392"
Apr 16 18:39:06.229193 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:06.229138 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00fa7d6d0b21b2fdb6ce8f6d92797705ccb1c250464d22d6e1d2ae5d478e0392"} err="failed to get container status \"00fa7d6d0b21b2fdb6ce8f6d92797705ccb1c250464d22d6e1d2ae5d478e0392\": rpc error: code = NotFound desc = could not find container \"00fa7d6d0b21b2fdb6ce8f6d92797705ccb1c250464d22d6e1d2ae5d478e0392\": container with ID starting with 00fa7d6d0b21b2fdb6ce8f6d92797705ccb1c250464d22d6e1d2ae5d478e0392 not found: ID does not exist"
Apr 16 18:39:06.229193 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:06.229161 2567 scope.go:117] "RemoveContainer" containerID="5df945ec8a2ed5c8e12d03b3f878376cc0953813af4424ebfcba1d5caa6ed8ce"
Apr 16 18:39:06.229422 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:39:06.229406 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5df945ec8a2ed5c8e12d03b3f878376cc0953813af4424ebfcba1d5caa6ed8ce\": container with ID starting with 5df945ec8a2ed5c8e12d03b3f878376cc0953813af4424ebfcba1d5caa6ed8ce not found: ID does not exist" containerID="5df945ec8a2ed5c8e12d03b3f878376cc0953813af4424ebfcba1d5caa6ed8ce"
Apr 16 18:39:06.229468 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:06.229427 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5df945ec8a2ed5c8e12d03b3f878376cc0953813af4424ebfcba1d5caa6ed8ce"} err="failed to get container status \"5df945ec8a2ed5c8e12d03b3f878376cc0953813af4424ebfcba1d5caa6ed8ce\": rpc error: code = NotFound desc = could not find container \"5df945ec8a2ed5c8e12d03b3f878376cc0953813af4424ebfcba1d5caa6ed8ce\": container with ID starting with 5df945ec8a2ed5c8e12d03b3f878376cc0953813af4424ebfcba1d5caa6ed8ce not found: ID does not exist"
Apr 16 18:39:06.235958 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:06.235936 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9"]
Apr 16 18:39:06.242172 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:06.242153 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-pqkq9"]
Apr 16 18:39:07.219354 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:07.219325 2567 generic.go:358] "Generic (PLEG): container finished" podID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerID="b5c574414089149f8570bbce13e3468d60ce40451e132bfcf18734bf4e8928d9" exitCode=0
Apr 16 18:39:07.219755 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:07.219376 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" event={"ID":"9b7a0e65-06e5-4b74-88be-d9b225d58609","Type":"ContainerDied","Data":"b5c574414089149f8570bbce13e3468d60ce40451e132bfcf18734bf4e8928d9"}
Apr 16 18:39:07.243053 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:07.243016 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" path="/var/lib/kubelet/pods/68c90c10-e870-4bb9-9935-4c2e2103fb01/volumes"
Apr 16 18:39:08.224089 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:08.224031 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" event={"ID":"9b7a0e65-06e5-4b74-88be-d9b225d58609","Type":"ContainerStarted","Data":"0faec92191fd4e7004ac4290078523f0a2ac3ae220c68d46cf7190ceafa0312e"}
Apr 16 18:39:08.224657 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:08.224352 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"
Apr 16 18:39:08.225729 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:08.225701 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 16 18:39:08.241389 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:08.241339 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" podStartSLOduration=6.241324282 podStartE2EDuration="6.241324282s" podCreationTimestamp="2026-04-16 18:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:39:08.240015163 +0000 UTC m=+1709.623678636" watchObservedRunningTime="2026-04-16 18:39:08.241324282 +0000 UTC m=+1709.624987752"
Apr 16 18:39:09.228672 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:09.228635 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 16 18:39:19.229150 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:19.229101 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 16 18:39:29.229095 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:29.229029 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 16 18:39:39.228884 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:39.228834 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 16 18:39:49.229684 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:49.229637 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 16 18:39:59.229084 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:39:59.229015 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 16 18:40:09.228941 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:09.228900 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 16 18:40:19.228592 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:19.228548 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 16 18:40:29.230725 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:29.230637 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"
Apr 16 18:40:33.917262 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:33.917210 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"]
Apr 16 18:40:33.917702 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:33.917578 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="kserve-container" containerID="cri-o://0faec92191fd4e7004ac4290078523f0a2ac3ae220c68d46cf7190ceafa0312e" gracePeriod=30
Apr 16 18:40:33.995326 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:33.995288 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"]
Apr 16 18:40:33.995658 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:33.995646 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container"
Apr 16 18:40:33.995716 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:33.995660 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container"
Apr 16 18:40:33.995716 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:33.995678 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="storage-initializer"
Apr 16 18:40:33.995716 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:33.995685 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="storage-initializer"
Apr 16 18:40:33.995820 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:33.995735 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="68c90c10-e870-4bb9-9935-4c2e2103fb01" containerName="kserve-container"
Apr 16 18:40:33.998940 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:33.998919 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"
Apr 16 18:40:34.008869 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:34.008836 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"]
Apr 16 18:40:34.014933 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:34.014908 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17917812-738f-419a-8309-657f5ed54d79-kserve-provision-location\") pod \"isvc-primary-4ec3af-predictor-649fb58556-zv8jb\" (UID: \"17917812-738f-419a-8309-657f5ed54d79\") " pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"
Apr 16 18:40:34.115560 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:34.115524 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17917812-738f-419a-8309-657f5ed54d79-kserve-provision-location\") pod \"isvc-primary-4ec3af-predictor-649fb58556-zv8jb\" (UID: \"17917812-738f-419a-8309-657f5ed54d79\") " pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"
Apr 16 18:40:34.115920 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:34.115898 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17917812-738f-419a-8309-657f5ed54d79-kserve-provision-location\") pod \"isvc-primary-4ec3af-predictor-649fb58556-zv8jb\" (UID: \"17917812-738f-419a-8309-657f5ed54d79\") " pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"
Apr 16 18:40:34.310206 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:34.310171 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"
Apr 16 18:40:34.429002 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:34.428974 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"]
Apr 16 18:40:34.431618 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:40:34.431585 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17917812_738f_419a_8309_657f5ed54d79.slice/crio-359504b62775ec7ef137cf36a1224fa0ea98195f20293fffb8efa2c07ed15420 WatchSource:0}: Error finding container 359504b62775ec7ef137cf36a1224fa0ea98195f20293fffb8efa2c07ed15420: Status 404 returned error can't find the container with id 359504b62775ec7ef137cf36a1224fa0ea98195f20293fffb8efa2c07ed15420
Apr 16 18:40:34.511852 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:34.511810 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" event={"ID":"17917812-738f-419a-8309-657f5ed54d79","Type":"ContainerStarted","Data":"845f26507aa6481102365adba00ff43fb10055d3a508911799a3bd7f41a93c6a"}
Apr 16 18:40:34.511950 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:34.511861 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" event={"ID":"17917812-738f-419a-8309-657f5ed54d79","Type":"ContainerStarted","Data":"359504b62775ec7ef137cf36a1224fa0ea98195f20293fffb8efa2c07ed15420"}
Apr 16 18:40:37.523376 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:37.523334 2567 generic.go:358] "Generic (PLEG): container finished" podID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerID="0faec92191fd4e7004ac4290078523f0a2ac3ae220c68d46cf7190ceafa0312e" exitCode=0
Apr 16 18:40:37.523718 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:37.523394 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" event={"ID":"9b7a0e65-06e5-4b74-88be-d9b225d58609","Type":"ContainerDied","Data":"0faec92191fd4e7004ac4290078523f0a2ac3ae220c68d46cf7190ceafa0312e"}
Apr 16 18:40:37.561246 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:37.561226 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"
Apr 16 18:40:37.642557 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:37.642475 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b7a0e65-06e5-4b74-88be-d9b225d58609-kserve-provision-location\") pod \"9b7a0e65-06e5-4b74-88be-d9b225d58609\" (UID: \"9b7a0e65-06e5-4b74-88be-d9b225d58609\") "
Apr 16 18:40:37.642822 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:37.642804 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7a0e65-06e5-4b74-88be-d9b225d58609-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9b7a0e65-06e5-4b74-88be-d9b225d58609" (UID: "9b7a0e65-06e5-4b74-88be-d9b225d58609"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:40:37.743933 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:37.743897 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b7a0e65-06e5-4b74-88be-d9b225d58609-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:40:38.527604 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:38.527574 2567 generic.go:358] "Generic (PLEG): container finished" podID="17917812-738f-419a-8309-657f5ed54d79" containerID="845f26507aa6481102365adba00ff43fb10055d3a508911799a3bd7f41a93c6a" exitCode=0
Apr 16 18:40:38.528001 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:38.527652 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" event={"ID":"17917812-738f-419a-8309-657f5ed54d79","Type":"ContainerDied","Data":"845f26507aa6481102365adba00ff43fb10055d3a508911799a3bd7f41a93c6a"}
Apr 16 18:40:38.529095 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:38.529072 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx" event={"ID":"9b7a0e65-06e5-4b74-88be-d9b225d58609","Type":"ContainerDied","Data":"c501163e371b069ab4fc119a03297d2aa7f3d3720468c12e1473731567da42ba"}
Apr 16 18:40:38.529095 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:38.529089 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"
Apr 16 18:40:38.529253 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:38.529114 2567 scope.go:117] "RemoveContainer" containerID="0faec92191fd4e7004ac4290078523f0a2ac3ae220c68d46cf7190ceafa0312e"
Apr 16 18:40:38.537743 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:38.537728 2567 scope.go:117] "RemoveContainer" containerID="b5c574414089149f8570bbce13e3468d60ce40451e132bfcf18734bf4e8928d9"
Apr 16 18:40:38.556563 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:38.556542 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"]
Apr 16 18:40:38.560524 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:38.560505 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-pqlxx"]
Apr 16 18:40:39.242458 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:39.242422 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" path="/var/lib/kubelet/pods/9b7a0e65-06e5-4b74-88be-d9b225d58609/volumes"
Apr 16 18:40:39.533698 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:39.533607 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" event={"ID":"17917812-738f-419a-8309-657f5ed54d79","Type":"ContainerStarted","Data":"874f0ac01aedb746cbeede6641d7894d92c669e6f68b7ba6234322186fc92604"}
Apr 16 18:40:39.534167 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:39.534028 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"
Apr 16 18:40:39.535255 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:39.535231 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" podUID="17917812-738f-419a-8309-657f5ed54d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 18:40:39.550740 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:39.550691 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" podStartSLOduration=6.550677121 podStartE2EDuration="6.550677121s" podCreationTimestamp="2026-04-16 18:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:40:39.54829988 +0000 UTC m=+1800.931963352" watchObservedRunningTime="2026-04-16 18:40:39.550677121 +0000 UTC m=+1800.934340592"
Apr 16 18:40:40.538337 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:40.538293 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" podUID="17917812-738f-419a-8309-657f5ed54d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 18:40:50.538265 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:40:50.538219 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" podUID="17917812-738f-419a-8309-657f5ed54d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 18:41:00.538511 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:00.538460 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" podUID="17917812-738f-419a-8309-657f5ed54d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 18:41:10.538876 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:10.538824 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" podUID="17917812-738f-419a-8309-657f5ed54d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 18:41:20.539064 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:20.539005 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" podUID="17917812-738f-419a-8309-657f5ed54d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 18:41:30.539197 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:30.539143 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" podUID="17917812-738f-419a-8309-657f5ed54d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 18:41:40.538831 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:40.538781 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" podUID="17917812-738f-419a-8309-657f5ed54d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 18:41:45.241462 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:45.241434 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"
Apr 16 18:41:54.140112 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.140069 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"]
Apr 16 18:41:54.140598 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.140583 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="storage-initializer"
Apr 16 18:41:54.140642 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.140601 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="storage-initializer"
Apr 16 18:41:54.140642 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.140612 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="kserve-container"
Apr 16 18:41:54.140642 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.140621 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="kserve-container"
Apr 16 18:41:54.140741 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.140713 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b7a0e65-06e5-4b74-88be-d9b225d58609" containerName="kserve-container"
Apr 16 18:41:54.144078 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.144058 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"
Apr 16 18:41:54.147111 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.147091 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 16 18:41:54.147296 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.147274 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-4ec3af-dockercfg-lrrgt\""
Apr 16 18:41:54.148300 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.148277 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-4ec3af\""
Apr 16 18:41:54.154191 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.154168 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"]
Apr 16 18:41:54.169308 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.169276 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07b05e09-c721-4970-925d-723ebb25f038-kserve-provision-location\") pod \"isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k\" (UID: \"07b05e09-c721-4970-925d-723ebb25f038\") " pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"
Apr 16 18:41:54.169428 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.169317 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/07b05e09-c721-4970-925d-723ebb25f038-cabundle-cert\") pod \"isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k\" (UID: \"07b05e09-c721-4970-925d-723ebb25f038\") " pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"
Apr 16 18:41:54.270513 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.270479 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07b05e09-c721-4970-925d-723ebb25f038-kserve-provision-location\") pod \"isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k\" (UID: \"07b05e09-c721-4970-925d-723ebb25f038\") " pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"
Apr 16 18:41:54.270513 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.270514 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/07b05e09-c721-4970-925d-723ebb25f038-cabundle-cert\") pod \"isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k\" (UID: \"07b05e09-c721-4970-925d-723ebb25f038\") " pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"
Apr 16 18:41:54.270858 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.270842 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07b05e09-c721-4970-925d-723ebb25f038-kserve-provision-location\") pod \"isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k\" (UID: \"07b05e09-c721-4970-925d-723ebb25f038\") " pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"
Apr 16 18:41:54.271135 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.271119 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/07b05e09-c721-4970-925d-723ebb25f038-cabundle-cert\") pod \"isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k\" (UID: \"07b05e09-c721-4970-925d-723ebb25f038\") " pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"
Apr 16 18:41:54.455543 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.455509 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"
Apr 16 18:41:54.576837 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.576814 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"]
Apr 16 18:41:54.579205 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:41:54.579175 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07b05e09_c721_4970_925d_723ebb25f038.slice/crio-cdedf6b59e9f9cf0b23b611d5d7169de953f76167c9884f65b22eb4eded5d25f WatchSource:0}: Error finding container cdedf6b59e9f9cf0b23b611d5d7169de953f76167c9884f65b22eb4eded5d25f: Status 404 returned error can't find the container with id cdedf6b59e9f9cf0b23b611d5d7169de953f76167c9884f65b22eb4eded5d25f
Apr 16 18:41:54.785514 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.785420 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k" event={"ID":"07b05e09-c721-4970-925d-723ebb25f038","Type":"ContainerStarted","Data":"f5e4ec589ab6e6a08be0ff1e3439c795df3641c956a1fa81647443158ca02e3c"}
Apr 16 18:41:54.785514 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:54.785467 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k" event={"ID":"07b05e09-c721-4970-925d-723ebb25f038","Type":"ContainerStarted","Data":"cdedf6b59e9f9cf0b23b611d5d7169de953f76167c9884f65b22eb4eded5d25f"}
Apr 16 18:41:57.796018 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:57.795991 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k_07b05e09-c721-4970-925d-723ebb25f038/storage-initializer/0.log"
Apr 16 18:41:57.796495 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:57.796027 2567 generic.go:358] "Generic (PLEG): container finished" podID="07b05e09-c721-4970-925d-723ebb25f038" containerID="f5e4ec589ab6e6a08be0ff1e3439c795df3641c956a1fa81647443158ca02e3c" exitCode=1
Apr 16 18:41:57.796495 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:57.796118 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k" event={"ID":"07b05e09-c721-4970-925d-723ebb25f038","Type":"ContainerDied","Data":"f5e4ec589ab6e6a08be0ff1e3439c795df3641c956a1fa81647443158ca02e3c"}
Apr 16 18:41:58.800974 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:58.800946 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k_07b05e09-c721-4970-925d-723ebb25f038/storage-initializer/0.log"
Apr 16 18:41:58.801410 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:41:58.801074 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k" event={"ID":"07b05e09-c721-4970-925d-723ebb25f038","Type":"ContainerStarted","Data":"b3d184b652b074b50ad9c49bef3f04598f2b09bddb06a8757ea502744886d339"}
Apr 16 18:42:01.812804 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:01.812775 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k_07b05e09-c721-4970-925d-723ebb25f038/storage-initializer/1.log"
Apr 16 18:42:01.813270 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:01.813136 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k_07b05e09-c721-4970-925d-723ebb25f038/storage-initializer/0.log"
Apr 16 18:42:01.813270 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:01.813164 2567 generic.go:358] "Generic (PLEG): container finished" podID="07b05e09-c721-4970-925d-723ebb25f038" containerID="b3d184b652b074b50ad9c49bef3f04598f2b09bddb06a8757ea502744886d339" exitCode=1
Apr 16 18:42:01.813270 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:01.813245 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k" event={"ID":"07b05e09-c721-4970-925d-723ebb25f038","Type":"ContainerDied","Data":"b3d184b652b074b50ad9c49bef3f04598f2b09bddb06a8757ea502744886d339"}
Apr 16 18:42:01.813378 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:01.813293 2567 scope.go:117] "RemoveContainer" containerID="f5e4ec589ab6e6a08be0ff1e3439c795df3641c956a1fa81647443158ca02e3c"
Apr 16 18:42:01.813743 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:01.813722 2567 scope.go:117] "RemoveContainer" containerID="f5e4ec589ab6e6a08be0ff1e3439c795df3641c956a1fa81647443158ca02e3c"
Apr 16 18:42:01.823969 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:42:01.823938 2567 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k_kserve-ci-e2e-test_07b05e09-c721-4970-925d-723ebb25f038_0 in pod sandbox cdedf6b59e9f9cf0b23b611d5d7169de953f76167c9884f65b22eb4eded5d25f from index: no such id: 'f5e4ec589ab6e6a08be0ff1e3439c795df3641c956a1fa81647443158ca02e3c'" containerID="f5e4ec589ab6e6a08be0ff1e3439c795df3641c956a1fa81647443158ca02e3c"
Apr 16 18:42:01.824071 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:42:01.823996 2567 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k_kserve-ci-e2e-test_07b05e09-c721-4970-925d-723ebb25f038_0 in pod sandbox cdedf6b59e9f9cf0b23b611d5d7169de953f76167c9884f65b22eb4eded5d25f from index: no such id: 'f5e4ec589ab6e6a08be0ff1e3439c795df3641c956a1fa81647443158ca02e3c'; Skipping pod \"isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k_kserve-ci-e2e-test(07b05e09-c721-4970-925d-723ebb25f038)\"" logger="UnhandledError"
Apr 16 18:42:01.825297 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:42:01.825275 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k_kserve-ci-e2e-test(07b05e09-c721-4970-925d-723ebb25f038)\"" pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k" podUID="07b05e09-c721-4970-925d-723ebb25f038"
Apr 16 18:42:02.818215 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:02.818188 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k_07b05e09-c721-4970-925d-723ebb25f038/storage-initializer/1.log"
Apr 16 18:42:08.262858 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.262823 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"]
Apr 16 18:42:08.321563 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.321526 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"]
Apr 16 18:42:08.321912 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.321866 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" podUID="17917812-738f-419a-8309-657f5ed54d79" containerName="kserve-container" containerID="cri-o://874f0ac01aedb746cbeede6641d7894d92c669e6f68b7ba6234322186fc92604" gracePeriod=30
Apr 16 18:42:08.368587 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.368557 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf"]
Apr 16 18:42:08.383157 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.383112 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf"]
Apr 16 18:42:08.383343 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.383260 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf"
Apr 16 18:42:08.386430 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.386404 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-6fabd9-dockercfg-9sxsk\""
Apr 16 18:42:08.387116 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.387095 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-6fabd9\""
Apr 16 18:42:08.413930 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.413899 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k_07b05e09-c721-4970-925d-723ebb25f038/storage-initializer/1.log"
Apr 16 18:42:08.414106 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.413963 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"
Apr 16 18:42:08.499013 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.498976 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/07b05e09-c721-4970-925d-723ebb25f038-cabundle-cert\") pod \"07b05e09-c721-4970-925d-723ebb25f038\" (UID: \"07b05e09-c721-4970-925d-723ebb25f038\") "
Apr 16 18:42:08.499218 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.499165 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07b05e09-c721-4970-925d-723ebb25f038-kserve-provision-location\") pod \"07b05e09-c721-4970-925d-723ebb25f038\" (UID: \"07b05e09-c721-4970-925d-723ebb25f038\") "
Apr 16 18:42:08.499334 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.499311 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07b05e09-c721-4970-925d-723ebb25f038-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "07b05e09-c721-4970-925d-723ebb25f038" (UID: "07b05e09-c721-4970-925d-723ebb25f038"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:42:08.499419 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.499399 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b4f1e82-ba1f-492d-9d3e-c909ec2d2885-cabundle-cert\") pod \"isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf\" (UID: \"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885\") " pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf"
Apr 16 18:42:08.499478 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.499417 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07b05e09-c721-4970-925d-723ebb25f038-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "07b05e09-c721-4970-925d-723ebb25f038" (UID: "07b05e09-c721-4970-925d-723ebb25f038"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:42:08.499544 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.499524 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b4f1e82-ba1f-492d-9d3e-c909ec2d2885-kserve-provision-location\") pod \"isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf\" (UID: \"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885\") " pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf"
Apr 16 18:42:08.499598 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.499585 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07b05e09-c721-4970-925d-723ebb25f038-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:42:08.499650 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.499600 2567 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/07b05e09-c721-4970-925d-723ebb25f038-cabundle-cert\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:42:08.600113 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.599967 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b4f1e82-ba1f-492d-9d3e-c909ec2d2885-kserve-provision-location\") pod \"isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf\" (UID: \"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885\") " pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf"
Apr 16 18:42:08.600113 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.600067 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b4f1e82-ba1f-492d-9d3e-c909ec2d2885-cabundle-cert\") pod \"isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf\" (UID: \"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885\") " pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf"
Apr 16 18:42:08.600477 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.600454 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b4f1e82-ba1f-492d-9d3e-c909ec2d2885-kserve-provision-location\") pod \"isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf\" (UID: \"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885\") " pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf"
Apr 16 18:42:08.600707 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.600690 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b4f1e82-ba1f-492d-9d3e-c909ec2d2885-cabundle-cert\") pod \"isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf\" (UID: \"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885\") " pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf"
Apr 16 18:42:08.710323 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.710291 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf"
Apr 16 18:42:08.832928 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.832901 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf"]
Apr 16 18:42:08.835777 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:42:08.835749 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b4f1e82_ba1f_492d_9d3e_c909ec2d2885.slice/crio-4d45c32a52ef457f5b90ada992e1000a61de47321bc2bb2dfba92d60e009be9a WatchSource:0}: Error finding container 4d45c32a52ef457f5b90ada992e1000a61de47321bc2bb2dfba92d60e009be9a: Status 404 returned error can't find the container with id 4d45c32a52ef457f5b90ada992e1000a61de47321bc2bb2dfba92d60e009be9a
Apr 16 18:42:08.838462 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.838437 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k_07b05e09-c721-4970-925d-723ebb25f038/storage-initializer/1.log"
Apr 16 18:42:08.838575 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.838538 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k" event={"ID":"07b05e09-c721-4970-925d-723ebb25f038","Type":"ContainerDied","Data":"cdedf6b59e9f9cf0b23b611d5d7169de953f76167c9884f65b22eb4eded5d25f"}
Apr 16 18:42:08.838638 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.838577 2567 scope.go:117] "RemoveContainer" containerID="b3d184b652b074b50ad9c49bef3f04598f2b09bddb06a8757ea502744886d339"
Apr 16 18:42:08.838638 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.838602 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"
Apr 16 18:42:08.876797 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.876771 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"]
Apr 16 18:42:08.880772 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:08.880749 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4ec3af-predictor-67c74cb697-xrs2k"]
Apr 16 18:42:09.243760 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:09.243724 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07b05e09-c721-4970-925d-723ebb25f038" path="/var/lib/kubelet/pods/07b05e09-c721-4970-925d-723ebb25f038/volumes"
Apr 16 18:42:09.843983 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:09.843944 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf" event={"ID":"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885","Type":"ContainerStarted","Data":"07db36be161edbedf74b7289086efe5ccb172c9a9a1c6d17ca21128e4f522ec9"}
Apr 16 18:42:09.843983 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:09.843988 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf" event={"ID":"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885","Type":"ContainerStarted","Data":"4d45c32a52ef457f5b90ada992e1000a61de47321bc2bb2dfba92d60e009be9a"}
Apr 16 18:42:12.761108 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.761084 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"
Apr 16 18:42:12.835948 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.835860 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17917812-738f-419a-8309-657f5ed54d79-kserve-provision-location\") pod \"17917812-738f-419a-8309-657f5ed54d79\" (UID: \"17917812-738f-419a-8309-657f5ed54d79\") "
Apr 16 18:42:12.836226 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.836207 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17917812-738f-419a-8309-657f5ed54d79-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "17917812-738f-419a-8309-657f5ed54d79" (UID: "17917812-738f-419a-8309-657f5ed54d79"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:42:12.854006 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.853972 2567 generic.go:358] "Generic (PLEG): container finished" podID="17917812-738f-419a-8309-657f5ed54d79" containerID="874f0ac01aedb746cbeede6641d7894d92c669e6f68b7ba6234322186fc92604" exitCode=0
Apr 16 18:42:12.854173 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.854062 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" event={"ID":"17917812-738f-419a-8309-657f5ed54d79","Type":"ContainerDied","Data":"874f0ac01aedb746cbeede6641d7894d92c669e6f68b7ba6234322186fc92604"}
Apr 16 18:42:12.854173 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.854067 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"
Apr 16 18:42:12.854173 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.854096 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb" event={"ID":"17917812-738f-419a-8309-657f5ed54d79","Type":"ContainerDied","Data":"359504b62775ec7ef137cf36a1224fa0ea98195f20293fffb8efa2c07ed15420"}
Apr 16 18:42:12.854173 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.854115 2567 scope.go:117] "RemoveContainer" containerID="874f0ac01aedb746cbeede6641d7894d92c669e6f68b7ba6234322186fc92604"
Apr 16 18:42:12.855527 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.855507 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf_5b4f1e82-ba1f-492d-9d3e-c909ec2d2885/storage-initializer/0.log"
Apr 16 18:42:12.855638 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.855545 2567 generic.go:358] "Generic (PLEG): container finished" podID="5b4f1e82-ba1f-492d-9d3e-c909ec2d2885" containerID="07db36be161edbedf74b7289086efe5ccb172c9a9a1c6d17ca21128e4f522ec9" exitCode=1
Apr 16 18:42:12.855638 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.855622 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf" event={"ID":"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885","Type":"ContainerDied","Data":"07db36be161edbedf74b7289086efe5ccb172c9a9a1c6d17ca21128e4f522ec9"}
Apr 16 18:42:12.865607 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.865585 2567 scope.go:117] "RemoveContainer" containerID="845f26507aa6481102365adba00ff43fb10055d3a508911799a3bd7f41a93c6a"
Apr 16 18:42:12.872563 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.872543 2567 scope.go:117] "RemoveContainer" containerID="874f0ac01aedb746cbeede6641d7894d92c669e6f68b7ba6234322186fc92604"
Apr 16 18:42:12.872841 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:42:12.872819 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"874f0ac01aedb746cbeede6641d7894d92c669e6f68b7ba6234322186fc92604\": container with ID starting with 874f0ac01aedb746cbeede6641d7894d92c669e6f68b7ba6234322186fc92604 not found: ID does not exist" containerID="874f0ac01aedb746cbeede6641d7894d92c669e6f68b7ba6234322186fc92604"
Apr 16 18:42:12.872940 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.872849 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"874f0ac01aedb746cbeede6641d7894d92c669e6f68b7ba6234322186fc92604"} err="failed to get container status \"874f0ac01aedb746cbeede6641d7894d92c669e6f68b7ba6234322186fc92604\": rpc error: code = NotFound desc = could not find container \"874f0ac01aedb746cbeede6641d7894d92c669e6f68b7ba6234322186fc92604\": container with ID starting with 874f0ac01aedb746cbeede6641d7894d92c669e6f68b7ba6234322186fc92604 not found: ID does not exist"
Apr 16 18:42:12.872940 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.872867 2567 scope.go:117] "RemoveContainer" containerID="845f26507aa6481102365adba00ff43fb10055d3a508911799a3bd7f41a93c6a"
Apr 16 18:42:12.873109 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:42:12.873092 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"845f26507aa6481102365adba00ff43fb10055d3a508911799a3bd7f41a93c6a\": container with ID starting with 845f26507aa6481102365adba00ff43fb10055d3a508911799a3bd7f41a93c6a not found: ID does not exist" containerID="845f26507aa6481102365adba00ff43fb10055d3a508911799a3bd7f41a93c6a"
Apr 16 18:42:12.873157 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.873115 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"845f26507aa6481102365adba00ff43fb10055d3a508911799a3bd7f41a93c6a"} err="failed to get container status \"845f26507aa6481102365adba00ff43fb10055d3a508911799a3bd7f41a93c6a\": rpc error: code = NotFound desc = could not find container \"845f26507aa6481102365adba00ff43fb10055d3a508911799a3bd7f41a93c6a\": container with ID starting with 845f26507aa6481102365adba00ff43fb10055d3a508911799a3bd7f41a93c6a not found: ID does not exist"
Apr 16 18:42:12.888713 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.888683 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"]
Apr 16 18:42:12.890805 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.890776 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4ec3af-predictor-649fb58556-zv8jb"]
Apr 16 18:42:12.937354 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:12.937329 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17917812-738f-419a-8309-657f5ed54d79-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:42:13.241976 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.241945 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17917812-738f-419a-8309-657f5ed54d79" path="/var/lib/kubelet/pods/17917812-738f-419a-8309-657f5ed54d79/volumes"
Apr 16 18:42:13.335286 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.335257 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf"]
Apr 16 18:42:13.462771 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.462720 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g"]
Apr 16 18:42:13.463278 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.463256 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17917812-738f-419a-8309-657f5ed54d79" containerName="storage-initializer"
Apr 16 18:42:13.463377 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.463279 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="17917812-738f-419a-8309-657f5ed54d79" containerName="storage-initializer"
Apr 16 18:42:13.463377 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.463291 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17917812-738f-419a-8309-657f5ed54d79" containerName="kserve-container"
Apr 16 18:42:13.463377 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.463303 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="17917812-738f-419a-8309-657f5ed54d79" containerName="kserve-container"
Apr 16 18:42:13.463377 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.463316 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07b05e09-c721-4970-925d-723ebb25f038" containerName="storage-initializer"
Apr 16 18:42:13.463377 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.463326 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b05e09-c721-4970-925d-723ebb25f038" containerName="storage-initializer"
Apr 16 18:42:13.463653 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.463431 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="07b05e09-c721-4970-925d-723ebb25f038" containerName="storage-initializer"
Apr 16 18:42:13.463653 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.463450 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="17917812-738f-419a-8309-657f5ed54d79" containerName="kserve-container"
Apr 16 18:42:13.463653 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.463533 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07b05e09-c721-4970-925d-723ebb25f038" containerName="storage-initializer"
Apr 16 18:42:13.463653 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.463543 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b05e09-c721-4970-925d-723ebb25f038" containerName="storage-initializer"
Apr 16 18:42:13.463653 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.463636 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="07b05e09-c721-4970-925d-723ebb25f038" containerName="storage-initializer"
Apr 16 18:42:13.467181 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.467155 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g"
Apr 16 18:42:13.469882 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.469860 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ww52g\""
Apr 16 18:42:13.472763 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.472740 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g"]
Apr 16 18:42:13.542696 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.542612 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79e2cc9c-cc24-4977-8eb9-f36feb284f74-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-cnj9g\" (UID: \"79e2cc9c-cc24-4977-8eb9-f36feb284f74\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g"
Apr 16 18:42:13.643281 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.643246 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79e2cc9c-cc24-4977-8eb9-f36feb284f74-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-cnj9g\" (UID: \"79e2cc9c-cc24-4977-8eb9-f36feb284f74\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g"
Apr 16 18:42:13.643611 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.643591 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79e2cc9c-cc24-4977-8eb9-f36feb284f74-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-cnj9g\" (UID: \"79e2cc9c-cc24-4977-8eb9-f36feb284f74\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g"
Apr 16 18:42:13.784581 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.784549 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g"
Apr 16 18:42:13.862770 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.862742 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf_5b4f1e82-ba1f-492d-9d3e-c909ec2d2885/storage-initializer/0.log"
Apr 16 18:42:13.862905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.862790 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf" event={"ID":"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885","Type":"ContainerStarted","Data":"dba8958fcba74c12f78671848fc80626611040d3b2cea9368948e111a4e65c82"}
Apr 16 18:42:13.863033 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.862995 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf" podUID="5b4f1e82-ba1f-492d-9d3e-c909ec2d2885" containerName="storage-initializer" containerID="cri-o://dba8958fcba74c12f78671848fc80626611040d3b2cea9368948e111a4e65c82" gracePeriod=30
Apr 16 18:42:13.910156 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:13.910134 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g"]
Apr 16 18:42:13.913094 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:42:13.913066 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79e2cc9c_cc24_4977_8eb9_f36feb284f74.slice/crio-6f5f7c93b9fb6a25524336ef41287089190a348b94f6eae631159f09ccf1c1d1 WatchSource:0}: Error finding container 6f5f7c93b9fb6a25524336ef41287089190a348b94f6eae631159f09ccf1c1d1: Status 404 returned error can't find the container with id 6f5f7c93b9fb6a25524336ef41287089190a348b94f6eae631159f09ccf1c1d1
Apr 16 18:42:14.867780 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:14.867742 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" event={"ID":"79e2cc9c-cc24-4977-8eb9-f36feb284f74","Type":"ContainerStarted","Data":"b57649bebd00446f22a5506e5760454cccd0f27237cbe918ad2ca6fb93813241"}
Apr 16 18:42:14.868175 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:14.867786 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" event={"ID":"79e2cc9c-cc24-4977-8eb9-f36feb284f74","Type":"ContainerStarted","Data":"6f5f7c93b9fb6a25524336ef41287089190a348b94f6eae631159f09ccf1c1d1"}
Apr 16 18:42:17.880157 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:17.880120 2567 generic.go:358] "Generic (PLEG): container finished" podID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerID="b57649bebd00446f22a5506e5760454cccd0f27237cbe918ad2ca6fb93813241" exitCode=0
Apr 16 18:42:17.880546 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:17.880156 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" event={"ID":"79e2cc9c-cc24-4977-8eb9-f36feb284f74","Type":"ContainerDied","Data":"b57649bebd00446f22a5506e5760454cccd0f27237cbe918ad2ca6fb93813241"}
Apr 16 18:42:17.881893 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:17.881785 2567 log.go:25] "Finished parsing log file"
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf_5b4f1e82-ba1f-492d-9d3e-c909ec2d2885/storage-initializer/1.log" Apr 16 18:42:17.882221 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:17.882199 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf_5b4f1e82-ba1f-492d-9d3e-c909ec2d2885/storage-initializer/0.log" Apr 16 18:42:17.882298 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:17.882243 2567 generic.go:358] "Generic (PLEG): container finished" podID="5b4f1e82-ba1f-492d-9d3e-c909ec2d2885" containerID="dba8958fcba74c12f78671848fc80626611040d3b2cea9368948e111a4e65c82" exitCode=1 Apr 16 18:42:17.882298 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:17.882287 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf" event={"ID":"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885","Type":"ContainerDied","Data":"dba8958fcba74c12f78671848fc80626611040d3b2cea9368948e111a4e65c82"} Apr 16 18:42:17.882380 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:17.882324 2567 scope.go:117] "RemoveContainer" containerID="07db36be161edbedf74b7289086efe5ccb172c9a9a1c6d17ca21128e4f522ec9" Apr 16 18:42:17.912091 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:17.912071 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf_5b4f1e82-ba1f-492d-9d3e-c909ec2d2885/storage-initializer/1.log" Apr 16 18:42:17.912174 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:17.912131 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf" Apr 16 18:42:17.978696 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:17.978633 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b4f1e82-ba1f-492d-9d3e-c909ec2d2885-cabundle-cert\") pod \"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885\" (UID: \"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885\") " Apr 16 18:42:17.978696 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:17.978691 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b4f1e82-ba1f-492d-9d3e-c909ec2d2885-kserve-provision-location\") pod \"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885\" (UID: \"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885\") " Apr 16 18:42:17.978979 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:17.978959 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b4f1e82-ba1f-492d-9d3e-c909ec2d2885-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "5b4f1e82-ba1f-492d-9d3e-c909ec2d2885" (UID: "5b4f1e82-ba1f-492d-9d3e-c909ec2d2885"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:42:17.979020 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:17.978965 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b4f1e82-ba1f-492d-9d3e-c909ec2d2885-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5b4f1e82-ba1f-492d-9d3e-c909ec2d2885" (UID: "5b4f1e82-ba1f-492d-9d3e-c909ec2d2885"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:42:18.079548 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:18.079515 2567 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b4f1e82-ba1f-492d-9d3e-c909ec2d2885-cabundle-cert\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:42:18.079548 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:18.079546 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b4f1e82-ba1f-492d-9d3e-c909ec2d2885-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:42:18.888130 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:18.888099 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf_5b4f1e82-ba1f-492d-9d3e-c909ec2d2885/storage-initializer/1.log" Apr 16 18:42:18.888590 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:18.888212 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf" event={"ID":"5b4f1e82-ba1f-492d-9d3e-c909ec2d2885","Type":"ContainerDied","Data":"4d45c32a52ef457f5b90ada992e1000a61de47321bc2bb2dfba92d60e009be9a"} Apr 16 18:42:18.888590 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:18.888234 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf" Apr 16 18:42:18.888590 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:18.888246 2567 scope.go:117] "RemoveContainer" containerID="dba8958fcba74c12f78671848fc80626611040d3b2cea9368948e111a4e65c82" Apr 16 18:42:18.922756 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:18.922722 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf"] Apr 16 18:42:18.926377 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:18.926342 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6fabd9-predictor-845d8958cc-v85bf"] Apr 16 18:42:19.243506 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:19.243466 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b4f1e82-ba1f-492d-9d3e-c909ec2d2885" path="/var/lib/kubelet/pods/5b4f1e82-ba1f-492d-9d3e-c909ec2d2885/volumes" Apr 16 18:42:38.961261 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:38.961225 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" event={"ID":"79e2cc9c-cc24-4977-8eb9-f36feb284f74","Type":"ContainerStarted","Data":"b3b930207b0f8720814af0c62e51d07717d9809b8366e97c06406ad1384b34fb"} Apr 16 18:42:38.961674 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:38.961521 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" Apr 16 18:42:38.962706 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:38.962678 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 18:42:38.982732 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:38.982687 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" podStartSLOduration=5.830852149 podStartE2EDuration="25.982673544s" podCreationTimestamp="2026-04-16 18:42:13 +0000 UTC" firstStartedPulling="2026-04-16 18:42:17.881440918 +0000 UTC m=+1899.265104378" lastFinishedPulling="2026-04-16 18:42:38.033262325 +0000 UTC m=+1919.416925773" observedRunningTime="2026-04-16 18:42:38.981266968 +0000 UTC m=+1920.364930442" watchObservedRunningTime="2026-04-16 18:42:38.982673544 +0000 UTC m=+1920.366337016" Apr 16 18:42:39.964854 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:39.964811 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 18:42:49.965141 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:49.965094 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 18:42:59.965199 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:42:59.965154 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 18:43:09.965098 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:43:09.965020 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 18:43:19.965514 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:43:19.965460 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 18:43:29.965886 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:43:29.965799 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 18:43:39.965880 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:43:39.965831 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 18:43:49.965390 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:43:49.965344 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 18:43:57.242191 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:43:57.242165 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" Apr 16 18:44:03.674718 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:03.674683 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g"] Apr 16 18:44:03.675577 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:03.675536 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="kserve-container" containerID="cri-o://b3b930207b0f8720814af0c62e51d07717d9809b8366e97c06406ad1384b34fb" gracePeriod=30 Apr 16 18:44:03.761854 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:03.761819 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn"] Apr 16 18:44:03.762171 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:03.762159 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b4f1e82-ba1f-492d-9d3e-c909ec2d2885" containerName="storage-initializer" Apr 16 18:44:03.762225 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:03.762173 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b4f1e82-ba1f-492d-9d3e-c909ec2d2885" containerName="storage-initializer" Apr 16 18:44:03.762267 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:03.762235 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b4f1e82-ba1f-492d-9d3e-c909ec2d2885" containerName="storage-initializer" Apr 16 18:44:03.762267 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:03.762244 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b4f1e82-ba1f-492d-9d3e-c909ec2d2885" containerName="storage-initializer" Apr 16 18:44:03.762339 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:03.762300 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b4f1e82-ba1f-492d-9d3e-c909ec2d2885" containerName="storage-initializer" Apr 16 18:44:03.762339 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:03.762307 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b4f1e82-ba1f-492d-9d3e-c909ec2d2885" containerName="storage-initializer" Apr 16 18:44:03.764305 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:03.764287 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" Apr 16 18:44:03.777488 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:03.777449 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn"] Apr 16 18:44:03.878972 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:03.878938 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47c1cdd3-3761-47ad-99fa-72c12320dbe3-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn\" (UID: \"47c1cdd3-3761-47ad-99fa-72c12320dbe3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" Apr 16 18:44:03.979416 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:03.979379 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47c1cdd3-3761-47ad-99fa-72c12320dbe3-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn\" (UID: \"47c1cdd3-3761-47ad-99fa-72c12320dbe3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" Apr 16 18:44:03.979817 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:03.979797 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47c1cdd3-3761-47ad-99fa-72c12320dbe3-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn\" (UID: \"47c1cdd3-3761-47ad-99fa-72c12320dbe3\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" Apr 16 18:44:04.075406 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:04.075357 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" Apr 16 18:44:04.195024 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:04.194901 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn"] Apr 16 18:44:04.197838 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:44:04.197806 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47c1cdd3_3761_47ad_99fa_72c12320dbe3.slice/crio-761c4e3472a9d5688ebfc4917c0cfe50a979facce453cf0fd83b5ba86670286d WatchSource:0}: Error finding container 761c4e3472a9d5688ebfc4917c0cfe50a979facce453cf0fd83b5ba86670286d: Status 404 returned error can't find the container with id 761c4e3472a9d5688ebfc4917c0cfe50a979facce453cf0fd83b5ba86670286d Apr 16 18:44:04.199935 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:04.199915 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:44:04.233665 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:04.233570 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" event={"ID":"47c1cdd3-3761-47ad-99fa-72c12320dbe3","Type":"ContainerStarted","Data":"761c4e3472a9d5688ebfc4917c0cfe50a979facce453cf0fd83b5ba86670286d"} Apr 16 18:44:05.242300 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:05.242270 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" event={"ID":"47c1cdd3-3761-47ad-99fa-72c12320dbe3","Type":"ContainerStarted","Data":"a2b254d0dc3a57f05dab01227821571d9d747a572b69b68e25666a1da876f93f"} Apr 16 18:44:07.238965 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:07.238922 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 18:44:08.250527 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:08.250495 2567 generic.go:358] "Generic (PLEG): container finished" podID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerID="a2b254d0dc3a57f05dab01227821571d9d747a572b69b68e25666a1da876f93f" exitCode=0 Apr 16 18:44:08.250957 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:08.250563 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" event={"ID":"47c1cdd3-3761-47ad-99fa-72c12320dbe3","Type":"ContainerDied","Data":"a2b254d0dc3a57f05dab01227821571d9d747a572b69b68e25666a1da876f93f"} Apr 16 18:44:08.418720 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:08.418699 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" Apr 16 18:44:08.517259 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:08.517173 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79e2cc9c-cc24-4977-8eb9-f36feb284f74-kserve-provision-location\") pod \"79e2cc9c-cc24-4977-8eb9-f36feb284f74\" (UID: \"79e2cc9c-cc24-4977-8eb9-f36feb284f74\") " Apr 16 18:44:08.517513 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:08.517491 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79e2cc9c-cc24-4977-8eb9-f36feb284f74-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "79e2cc9c-cc24-4977-8eb9-f36feb284f74" (UID: "79e2cc9c-cc24-4977-8eb9-f36feb284f74"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:08.618397 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:08.618350 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79e2cc9c-cc24-4977-8eb9-f36feb284f74-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:44:09.254750 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.254717 2567 generic.go:358] "Generic (PLEG): container finished" podID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerID="b3b930207b0f8720814af0c62e51d07717d9809b8366e97c06406ad1384b34fb" exitCode=0 Apr 16 18:44:09.255184 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.254799 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" Apr 16 18:44:09.255184 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.254793 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" event={"ID":"79e2cc9c-cc24-4977-8eb9-f36feb284f74","Type":"ContainerDied","Data":"b3b930207b0f8720814af0c62e51d07717d9809b8366e97c06406ad1384b34fb"} Apr 16 18:44:09.255184 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.254834 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g" event={"ID":"79e2cc9c-cc24-4977-8eb9-f36feb284f74","Type":"ContainerDied","Data":"6f5f7c93b9fb6a25524336ef41287089190a348b94f6eae631159f09ccf1c1d1"} Apr 16 18:44:09.255184 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.254849 2567 scope.go:117] "RemoveContainer" containerID="b3b930207b0f8720814af0c62e51d07717d9809b8366e97c06406ad1384b34fb" Apr 16 18:44:09.256572 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.256536 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" event={"ID":"47c1cdd3-3761-47ad-99fa-72c12320dbe3","Type":"ContainerStarted","Data":"7f7633bd6611850d2a420de39c4d4897addc0a04de9610773479bcec670b2676"} Apr 16 18:44:09.257031 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.257009 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" Apr 16 18:44:09.258383 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.258352 2567 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 18:44:09.263032 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.263019 2567 scope.go:117] "RemoveContainer" containerID="b57649bebd00446f22a5506e5760454cccd0f27237cbe918ad2ca6fb93813241" Apr 16 18:44:09.269974 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.269956 2567 scope.go:117] "RemoveContainer" containerID="b3b930207b0f8720814af0c62e51d07717d9809b8366e97c06406ad1384b34fb" Apr 16 18:44:09.270318 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:44:09.270299 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3b930207b0f8720814af0c62e51d07717d9809b8366e97c06406ad1384b34fb\": container with ID starting with b3b930207b0f8720814af0c62e51d07717d9809b8366e97c06406ad1384b34fb not found: ID does not exist" containerID="b3b930207b0f8720814af0c62e51d07717d9809b8366e97c06406ad1384b34fb" Apr 16 18:44:09.270374 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.270327 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3b930207b0f8720814af0c62e51d07717d9809b8366e97c06406ad1384b34fb"} err="failed to get container status \"b3b930207b0f8720814af0c62e51d07717d9809b8366e97c06406ad1384b34fb\": rpc error: code = NotFound desc = could not find container \"b3b930207b0f8720814af0c62e51d07717d9809b8366e97c06406ad1384b34fb\": container with ID starting with b3b930207b0f8720814af0c62e51d07717d9809b8366e97c06406ad1384b34fb not found: ID does not exist" Apr 16 18:44:09.270374 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.270345 2567 scope.go:117] "RemoveContainer" containerID="b57649bebd00446f22a5506e5760454cccd0f27237cbe918ad2ca6fb93813241" Apr 16 18:44:09.270542 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:44:09.270528 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b57649bebd00446f22a5506e5760454cccd0f27237cbe918ad2ca6fb93813241\": container with ID starting with b57649bebd00446f22a5506e5760454cccd0f27237cbe918ad2ca6fb93813241 not found: ID does not exist" containerID="b57649bebd00446f22a5506e5760454cccd0f27237cbe918ad2ca6fb93813241" Apr 16 18:44:09.270594 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.270545 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b57649bebd00446f22a5506e5760454cccd0f27237cbe918ad2ca6fb93813241"} err="failed to get container status \"b57649bebd00446f22a5506e5760454cccd0f27237cbe918ad2ca6fb93813241\": rpc error: code = NotFound desc = could not find container \"b57649bebd00446f22a5506e5760454cccd0f27237cbe918ad2ca6fb93813241\": container with ID starting with b57649bebd00446f22a5506e5760454cccd0f27237cbe918ad2ca6fb93813241 not found: ID does not exist" Apr 16 18:44:09.275773 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.275733 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" podStartSLOduration=6.275723316 podStartE2EDuration="6.275723316s" podCreationTimestamp="2026-04-16 18:44:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:44:09.27300568 +0000 UTC 
m=+2010.656669163" watchObservedRunningTime="2026-04-16 18:44:09.275723316 +0000 UTC m=+2010.659386820" Apr 16 18:44:09.285803 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.285776 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g"] Apr 16 18:44:09.288619 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:09.288598 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-cnj9g"] Apr 16 18:44:10.260627 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:10.260588 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 18:44:11.241803 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:11.241772 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" path="/var/lib/kubelet/pods/79e2cc9c-cc24-4977-8eb9-f36feb284f74/volumes" Apr 16 18:44:20.261320 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:20.261275 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 18:44:30.260927 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:30.260883 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 18:44:40.261356 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:40.261315 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 18:44:50.261510 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:44:50.261475 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 18:45:00.260649 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:00.260568 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 18:45:10.261062 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:10.261010 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 18:45:20.261153 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:20.261099 2567 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 18:45:30.261246 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:30.261214 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" Apr 16 18:45:33.876601 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:33.876563 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn"] Apr 16 18:45:33.877068 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:33.876880 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="kserve-container" containerID="cri-o://7f7633bd6611850d2a420de39c4d4897addc0a04de9610773479bcec670b2676" gracePeriod=30 Apr 16 18:45:33.949411 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:33.949371 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z"] Apr 16 18:45:33.949856 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:33.949836 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="kserve-container" Apr 16 18:45:33.949856 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:33.949857 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="kserve-container" Apr 16 18:45:33.949977 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:33.949870 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="storage-initializer" Apr 16 18:45:33.949977 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:33.949876 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="storage-initializer" Apr 16 18:45:33.949977 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:33.949945 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="79e2cc9c-cc24-4977-8eb9-f36feb284f74" containerName="kserve-container" Apr 16 18:45:33.953113 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:33.953095 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" Apr 16 18:45:33.959950 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:33.959915 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z"] Apr 16 18:45:34.040120 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:34.040079 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03ad0409-2241-4df3-b287-2c7b36e9743e-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-82q5z\" (UID: \"03ad0409-2241-4df3-b287-2c7b36e9743e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" Apr 16 18:45:34.140788 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:34.140698 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03ad0409-2241-4df3-b287-2c7b36e9743e-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-82q5z\" (UID: \"03ad0409-2241-4df3-b287-2c7b36e9743e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" Apr 16 18:45:34.141097 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:34.141077 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03ad0409-2241-4df3-b287-2c7b36e9743e-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-82q5z\" (UID: \"03ad0409-2241-4df3-b287-2c7b36e9743e\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" Apr 16 18:45:34.266066 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:34.266007 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" Apr 16 18:45:34.395060 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:34.395017 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z"] Apr 16 18:45:34.398323 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:45:34.398272 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03ad0409_2241_4df3_b287_2c7b36e9743e.slice/crio-3b236a661d686e74eaa5045f95e62060a7d78036d18c54d0e43597c26af1681a WatchSource:0}: Error finding container 3b236a661d686e74eaa5045f95e62060a7d78036d18c54d0e43597c26af1681a: Status 404 returned error can't find the container with id 3b236a661d686e74eaa5045f95e62060a7d78036d18c54d0e43597c26af1681a Apr 16 18:45:34.537806 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:34.537772 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" event={"ID":"03ad0409-2241-4df3-b287-2c7b36e9743e","Type":"ContainerStarted","Data":"096bf8d2b720aceaeda17b7ea02711d2682505e87a71beb503266da676e6602f"} Apr 16 18:45:34.537806 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:34.537813 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" event={"ID":"03ad0409-2241-4df3-b287-2c7b36e9743e","Type":"ContainerStarted","Data":"3b236a661d686e74eaa5045f95e62060a7d78036d18c54d0e43597c26af1681a"} Apr 16 18:45:38.527990 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.527966 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" Apr 16 18:45:38.552640 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.552607 2567 generic.go:358] "Generic (PLEG): container finished" podID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerID="096bf8d2b720aceaeda17b7ea02711d2682505e87a71beb503266da676e6602f" exitCode=0 Apr 16 18:45:38.552766 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.552641 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" event={"ID":"03ad0409-2241-4df3-b287-2c7b36e9743e","Type":"ContainerDied","Data":"096bf8d2b720aceaeda17b7ea02711d2682505e87a71beb503266da676e6602f"} Apr 16 18:45:38.554276 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.554248 2567 generic.go:358] "Generic (PLEG): container finished" podID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerID="7f7633bd6611850d2a420de39c4d4897addc0a04de9610773479bcec670b2676" exitCode=0 Apr 16 18:45:38.554393 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.554381 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" Apr 16 18:45:38.554446 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.554392 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" event={"ID":"47c1cdd3-3761-47ad-99fa-72c12320dbe3","Type":"ContainerDied","Data":"7f7633bd6611850d2a420de39c4d4897addc0a04de9610773479bcec670b2676"} Apr 16 18:45:38.554446 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.554425 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn" event={"ID":"47c1cdd3-3761-47ad-99fa-72c12320dbe3","Type":"ContainerDied","Data":"761c4e3472a9d5688ebfc4917c0cfe50a979facce453cf0fd83b5ba86670286d"} Apr 16 18:45:38.554533 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.554452 2567 scope.go:117] "RemoveContainer" containerID="7f7633bd6611850d2a420de39c4d4897addc0a04de9610773479bcec670b2676" Apr 16 18:45:38.562929 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.562911 2567 scope.go:117] "RemoveContainer" containerID="a2b254d0dc3a57f05dab01227821571d9d747a572b69b68e25666a1da876f93f" Apr 16 18:45:38.571918 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.571897 2567 scope.go:117] "RemoveContainer" containerID="7f7633bd6611850d2a420de39c4d4897addc0a04de9610773479bcec670b2676" Apr 16 18:45:38.572252 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:45:38.572226 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f7633bd6611850d2a420de39c4d4897addc0a04de9610773479bcec670b2676\": container with ID starting with 7f7633bd6611850d2a420de39c4d4897addc0a04de9610773479bcec670b2676 not found: ID does not exist" containerID="7f7633bd6611850d2a420de39c4d4897addc0a04de9610773479bcec670b2676" Apr 16 18:45:38.572350 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.572259 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f7633bd6611850d2a420de39c4d4897addc0a04de9610773479bcec670b2676"} err="failed to get container status \"7f7633bd6611850d2a420de39c4d4897addc0a04de9610773479bcec670b2676\": rpc error: code = NotFound desc = could not find container \"7f7633bd6611850d2a420de39c4d4897addc0a04de9610773479bcec670b2676\": container with ID starting with 7f7633bd6611850d2a420de39c4d4897addc0a04de9610773479bcec670b2676 not found: ID does not exist" Apr 16 18:45:38.572350 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.572281 2567 scope.go:117] "RemoveContainer" containerID="a2b254d0dc3a57f05dab01227821571d9d747a572b69b68e25666a1da876f93f" Apr 16 18:45:38.572542 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:45:38.572523 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b254d0dc3a57f05dab01227821571d9d747a572b69b68e25666a1da876f93f\": container with ID starting with a2b254d0dc3a57f05dab01227821571d9d747a572b69b68e25666a1da876f93f not found: ID does not exist" containerID="a2b254d0dc3a57f05dab01227821571d9d747a572b69b68e25666a1da876f93f" Apr 16 18:45:38.572601 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.572561 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b254d0dc3a57f05dab01227821571d9d747a572b69b68e25666a1da876f93f"} err="failed to get container status \"a2b254d0dc3a57f05dab01227821571d9d747a572b69b68e25666a1da876f93f\": 
rpc error: code = NotFound desc = could not find container \"a2b254d0dc3a57f05dab01227821571d9d747a572b69b68e25666a1da876f93f\": container with ID starting with a2b254d0dc3a57f05dab01227821571d9d747a572b69b68e25666a1da876f93f not found: ID does not exist" Apr 16 18:45:38.576937 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.576918 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47c1cdd3-3761-47ad-99fa-72c12320dbe3-kserve-provision-location\") pod \"47c1cdd3-3761-47ad-99fa-72c12320dbe3\" (UID: \"47c1cdd3-3761-47ad-99fa-72c12320dbe3\") " Apr 16 18:45:38.577282 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.577260 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47c1cdd3-3761-47ad-99fa-72c12320dbe3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "47c1cdd3-3761-47ad-99fa-72c12320dbe3" (UID: "47c1cdd3-3761-47ad-99fa-72c12320dbe3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:45:38.677891 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.677865 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/47c1cdd3-3761-47ad-99fa-72c12320dbe3-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:45:38.877744 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.877708 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn"] Apr 16 18:45:38.879422 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:38.879401 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-5zsdn"] Apr 16 18:45:39.241693 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:39.241661 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" path="/var/lib/kubelet/pods/47c1cdd3-3761-47ad-99fa-72c12320dbe3/volumes" Apr 16 18:45:39.558684 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:39.558600 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" event={"ID":"03ad0409-2241-4df3-b287-2c7b36e9743e","Type":"ContainerStarted","Data":"284be128693874c4cd90f20645ce5132e9d38306055871f98aab399268cfb6b5"} Apr 16 18:45:39.559069 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:39.558899 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" Apr 16 18:45:39.560276 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:39.560251 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 18:45:39.574925 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:39.574881 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" podStartSLOduration=6.574867455 podStartE2EDuration="6.574867455s" podCreationTimestamp="2026-04-16 18:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:45:39.573647771 +0000 UTC m=+2100.957311242" watchObservedRunningTime="2026-04-16 18:45:39.574867455 +0000 UTC m=+2100.958530925" Apr 16 18:45:40.563578 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:40.563541 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 18:45:50.564066 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:45:50.563999 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 18:46:00.564486 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:00.564438 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 18:46:10.564414 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:10.564365 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 18:46:20.564130 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:20.564003 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 18:46:30.563857 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:30.563769 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 18:46:40.563648 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:40.563604 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 18:46:42.237934 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:42.237883 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 18:46:52.239253 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:52.239215 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" Apr 16 18:46:54.072157 ip-10-0-141-192 
kubenswrapper[2567]: I0416 18:46:54.072121 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z"]
Apr 16 18:46:54.072516 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.072391 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="kserve-container" containerID="cri-o://284be128693874c4cd90f20645ce5132e9d38306055871f98aab399268cfb6b5" gracePeriod=30
Apr 16 18:46:54.181854 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.181819 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"]
Apr 16 18:46:54.182186 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.182174 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="storage-initializer"
Apr 16 18:46:54.182242 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.182188 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="storage-initializer"
Apr 16 18:46:54.182242 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.182198 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="kserve-container"
Apr 16 18:46:54.182242 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.182205 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="kserve-container"
Apr 16 18:46:54.182335 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.182264 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="47c1cdd3-3761-47ad-99fa-72c12320dbe3" containerName="kserve-container"
Apr 16 18:46:54.185656 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.185626 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"
Apr 16 18:46:54.196315 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.196083 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"]
Apr 16 18:46:54.262777 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.262746 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9eec8f00-e42b-4fce-840c-5bed3998b2e6-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7\" (UID: \"9eec8f00-e42b-4fce-840c-5bed3998b2e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"
Apr 16 18:46:54.363418 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.363341 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9eec8f00-e42b-4fce-840c-5bed3998b2e6-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7\" (UID: \"9eec8f00-e42b-4fce-840c-5bed3998b2e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"
Apr 16 18:46:54.363753 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.363733 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9eec8f00-e42b-4fce-840c-5bed3998b2e6-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7\" (UID: \"9eec8f00-e42b-4fce-840c-5bed3998b2e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"
Apr 16 18:46:54.497586 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.497557 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"
Apr 16 18:46:54.619253 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.619161 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"]
Apr 16 18:46:54.622141 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:46:54.622110 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eec8f00_e42b_4fce_840c_5bed3998b2e6.slice/crio-26fcfb65574169fc32ae6ecb7449552507978557c06608babde3c656e3799b7d WatchSource:0}: Error finding container 26fcfb65574169fc32ae6ecb7449552507978557c06608babde3c656e3799b7d: Status 404 returned error can't find the container with id 26fcfb65574169fc32ae6ecb7449552507978557c06608babde3c656e3799b7d
Apr 16 18:46:54.813834 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.813787 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7" event={"ID":"9eec8f00-e42b-4fce-840c-5bed3998b2e6","Type":"ContainerStarted","Data":"9d721a8b616faa4e9ce9807225c40175d5339eb0e5d5c67dcc4e24869690406c"}
Apr 16 18:46:54.813834 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:54.813832 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7" event={"ID":"9eec8f00-e42b-4fce-840c-5bed3998b2e6","Type":"ContainerStarted","Data":"26fcfb65574169fc32ae6ecb7449552507978557c06608babde3c656e3799b7d"}
Apr 16 18:46:58.715117 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.715094 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z"
Apr 16 18:46:58.800507 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.800413 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03ad0409-2241-4df3-b287-2c7b36e9743e-kserve-provision-location\") pod \"03ad0409-2241-4df3-b287-2c7b36e9743e\" (UID: \"03ad0409-2241-4df3-b287-2c7b36e9743e\") "
Apr 16 18:46:58.800757 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.800735 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03ad0409-2241-4df3-b287-2c7b36e9743e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "03ad0409-2241-4df3-b287-2c7b36e9743e" (UID: "03ad0409-2241-4df3-b287-2c7b36e9743e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:46:58.828369 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.828340 2567 generic.go:358] "Generic (PLEG): container finished" podID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerID="284be128693874c4cd90f20645ce5132e9d38306055871f98aab399268cfb6b5" exitCode=0
Apr 16 18:46:58.828536 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.828399 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" event={"ID":"03ad0409-2241-4df3-b287-2c7b36e9743e","Type":"ContainerDied","Data":"284be128693874c4cd90f20645ce5132e9d38306055871f98aab399268cfb6b5"}
Apr 16 18:46:58.828536 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.828408 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z"
Apr 16 18:46:58.828536 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.828432 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z" event={"ID":"03ad0409-2241-4df3-b287-2c7b36e9743e","Type":"ContainerDied","Data":"3b236a661d686e74eaa5045f95e62060a7d78036d18c54d0e43597c26af1681a"}
Apr 16 18:46:58.828536 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.828459 2567 scope.go:117] "RemoveContainer" containerID="284be128693874c4cd90f20645ce5132e9d38306055871f98aab399268cfb6b5"
Apr 16 18:46:58.829879 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.829856 2567 generic.go:358] "Generic (PLEG): container finished" podID="9eec8f00-e42b-4fce-840c-5bed3998b2e6" containerID="9d721a8b616faa4e9ce9807225c40175d5339eb0e5d5c67dcc4e24869690406c" exitCode=0
Apr 16 18:46:58.830013 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.829918 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7" event={"ID":"9eec8f00-e42b-4fce-840c-5bed3998b2e6","Type":"ContainerDied","Data":"9d721a8b616faa4e9ce9807225c40175d5339eb0e5d5c67dcc4e24869690406c"}
Apr 16 18:46:58.840354 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.840338 2567 scope.go:117] "RemoveContainer" containerID="096bf8d2b720aceaeda17b7ea02711d2682505e87a71beb503266da676e6602f"
Apr 16 18:46:58.849229 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.849211 2567 scope.go:117] "RemoveContainer" containerID="284be128693874c4cd90f20645ce5132e9d38306055871f98aab399268cfb6b5"
Apr 16 18:46:58.849503 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:46:58.849484 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"284be128693874c4cd90f20645ce5132e9d38306055871f98aab399268cfb6b5\": container with ID starting with 284be128693874c4cd90f20645ce5132e9d38306055871f98aab399268cfb6b5 not found: ID does not exist" containerID="284be128693874c4cd90f20645ce5132e9d38306055871f98aab399268cfb6b5"
Apr 16 18:46:58.849557 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.849511 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"284be128693874c4cd90f20645ce5132e9d38306055871f98aab399268cfb6b5"} err="failed to get container status \"284be128693874c4cd90f20645ce5132e9d38306055871f98aab399268cfb6b5\": rpc error: code = NotFound desc = could not find container \"284be128693874c4cd90f20645ce5132e9d38306055871f98aab399268cfb6b5\": container with ID starting with 284be128693874c4cd90f20645ce5132e9d38306055871f98aab399268cfb6b5 not found: ID does not exist"
Apr 16 18:46:58.849557 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.849532 2567 scope.go:117] "RemoveContainer" containerID="096bf8d2b720aceaeda17b7ea02711d2682505e87a71beb503266da676e6602f"
Apr 16 18:46:58.849792 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:46:58.849760 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"096bf8d2b720aceaeda17b7ea02711d2682505e87a71beb503266da676e6602f\": container with ID starting with 096bf8d2b720aceaeda17b7ea02711d2682505e87a71beb503266da676e6602f not found: ID does not exist" containerID="096bf8d2b720aceaeda17b7ea02711d2682505e87a71beb503266da676e6602f"
Apr 16 18:46:58.849847 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.849795 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"096bf8d2b720aceaeda17b7ea02711d2682505e87a71beb503266da676e6602f"} err="failed to get container status \"096bf8d2b720aceaeda17b7ea02711d2682505e87a71beb503266da676e6602f\": rpc error: code = NotFound desc = could not find container \"096bf8d2b720aceaeda17b7ea02711d2682505e87a71beb503266da676e6602f\": container with ID starting with 096bf8d2b720aceaeda17b7ea02711d2682505e87a71beb503266da676e6602f not found: ID does not exist"
Apr 16 18:46:58.862231 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.862206 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z"]
Apr 16 18:46:58.866222 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.866201 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-82q5z"]
Apr 16 18:46:58.901432 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:58.901406 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03ad0409-2241-4df3-b287-2c7b36e9743e-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:46:59.241868 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:59.241838 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" path="/var/lib/kubelet/pods/03ad0409-2241-4df3-b287-2c7b36e9743e/volumes"
Apr 16 18:46:59.835839 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:59.835805 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7" event={"ID":"9eec8f00-e42b-4fce-840c-5bed3998b2e6","Type":"ContainerStarted","Data":"50c611e8df91e40454b8c2bd38b3e8c59e13024d03464875f86cd511ee4d9875"}
Apr 16 18:46:59.836229 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:59.836014 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"
Apr 16 18:46:59.854842 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:46:59.854795 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7" podStartSLOduration=5.854781215 podStartE2EDuration="5.854781215s" podCreationTimestamp="2026-04-16 18:46:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:46:59.851928349 +0000 UTC m=+2181.235591822" watchObservedRunningTime="2026-04-16 18:46:59.854781215 +0000 UTC m=+2181.238444686"
Apr 16 18:47:30.840834 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:47:30.840786 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7" podUID="9eec8f00-e42b-4fce-840c-5bed3998b2e6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.55:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.55:8080: connect: connection refused"
Apr 16 18:47:40.840668 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:47:40.840620 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7" podUID="9eec8f00-e42b-4fce-840c-5bed3998b2e6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.55:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.55:8080: connect: connection refused"
Apr 16 18:47:50.840663 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:47:50.840620 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7" podUID="9eec8f00-e42b-4fce-840c-5bed3998b2e6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.55:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.55:8080: connect: connection refused"
Apr 16 18:48:00.840526 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:00.840443 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7" podUID="9eec8f00-e42b-4fce-840c-5bed3998b2e6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.55:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.55:8080: connect: connection refused"
Apr 16 18:48:10.237902 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:10.237852 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7" podUID="9eec8f00-e42b-4fce-840c-5bed3998b2e6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.55:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.55:8080: connect: connection refused"
Apr 16 18:48:20.241462 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:20.241428 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"
Apr 16 18:48:24.319386 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.319347 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"]
Apr 16 18:48:24.319753 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.319717 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7" podUID="9eec8f00-e42b-4fce-840c-5bed3998b2e6" containerName="kserve-container" containerID="cri-o://50c611e8df91e40454b8c2bd38b3e8c59e13024d03464875f86cd511ee4d9875" gracePeriod=30
Apr 16 18:48:24.411687 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.411651 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"]
Apr 16 18:48:24.412089 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.412073 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="kserve-container"
Apr 16 18:48:24.412186 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.412092 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="kserve-container"
Apr 16 18:48:24.412186 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.412105 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="storage-initializer"
Apr 16 18:48:24.412186 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.412113 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="storage-initializer"
Apr 16 18:48:24.412342 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.412207 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="03ad0409-2241-4df3-b287-2c7b36e9743e" containerName="kserve-container"
Apr 16 18:48:24.415505 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.415483 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"
Apr 16 18:48:24.423908 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.423883 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"]
Apr 16 18:48:24.511698 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.511669 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fd7f3c1-92dc-4d1c-98ef-3280fcadac79-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b\" (UID: \"0fd7f3c1-92dc-4d1c-98ef-3280fcadac79\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"
Apr 16 18:48:24.612795 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.612702 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fd7f3c1-92dc-4d1c-98ef-3280fcadac79-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b\" (UID: \"0fd7f3c1-92dc-4d1c-98ef-3280fcadac79\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"
Apr 16 18:48:24.613140 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.613110 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fd7f3c1-92dc-4d1c-98ef-3280fcadac79-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b\" (UID: \"0fd7f3c1-92dc-4d1c-98ef-3280fcadac79\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"
Apr 16 18:48:24.727521 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.727486 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"
Apr 16 18:48:24.849004 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:24.848979 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"]
Apr 16 18:48:24.851470 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:48:24.851438 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fd7f3c1_92dc_4d1c_98ef_3280fcadac79.slice/crio-c835b995ae832e623ad0d2ae2a9ef37f0d6e276bb9c5ac098312cdf0b0fd84ca WatchSource:0}: Error finding container c835b995ae832e623ad0d2ae2a9ef37f0d6e276bb9c5ac098312cdf0b0fd84ca: Status 404 returned error can't find the container with id c835b995ae832e623ad0d2ae2a9ef37f0d6e276bb9c5ac098312cdf0b0fd84ca
Apr 16 18:48:25.126854 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:25.126762 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b" event={"ID":"0fd7f3c1-92dc-4d1c-98ef-3280fcadac79","Type":"ContainerStarted","Data":"6ff9806c5b1005131ef2ae5ca622e215175a575625b00e6b5f30361c43b92367"}
Apr 16 18:48:25.126854 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:25.126815 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b" event={"ID":"0fd7f3c1-92dc-4d1c-98ef-3280fcadac79","Type":"ContainerStarted","Data":"c835b995ae832e623ad0d2ae2a9ef37f0d6e276bb9c5ac098312cdf0b0fd84ca"}
Apr 16 18:48:29.062090 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.062066 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"
Apr 16 18:48:29.140794 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.140749 2567 generic.go:358] "Generic (PLEG): container finished" podID="9eec8f00-e42b-4fce-840c-5bed3998b2e6" containerID="50c611e8df91e40454b8c2bd38b3e8c59e13024d03464875f86cd511ee4d9875" exitCode=0
Apr 16 18:48:29.140993 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.140821 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"
Apr 16 18:48:29.140993 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.140836 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7" event={"ID":"9eec8f00-e42b-4fce-840c-5bed3998b2e6","Type":"ContainerDied","Data":"50c611e8df91e40454b8c2bd38b3e8c59e13024d03464875f86cd511ee4d9875"}
Apr 16 18:48:29.140993 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.140890 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7" event={"ID":"9eec8f00-e42b-4fce-840c-5bed3998b2e6","Type":"ContainerDied","Data":"26fcfb65574169fc32ae6ecb7449552507978557c06608babde3c656e3799b7d"}
Apr 16 18:48:29.140993 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.140913 2567 scope.go:117] "RemoveContainer" containerID="50c611e8df91e40454b8c2bd38b3e8c59e13024d03464875f86cd511ee4d9875"
Apr 16 18:48:29.142206 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.142187 2567 generic.go:358] "Generic (PLEG): container finished" podID="0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" containerID="6ff9806c5b1005131ef2ae5ca622e215175a575625b00e6b5f30361c43b92367" exitCode=0
Apr 16 18:48:29.142310 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.142259 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b" event={"ID":"0fd7f3c1-92dc-4d1c-98ef-3280fcadac79","Type":"ContainerDied","Data":"6ff9806c5b1005131ef2ae5ca622e215175a575625b00e6b5f30361c43b92367"}
Apr 16 18:48:29.148740 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.148719 2567 scope.go:117] "RemoveContainer" containerID="9d721a8b616faa4e9ce9807225c40175d5339eb0e5d5c67dcc4e24869690406c"
Apr 16 18:48:29.151072 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.151031 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9eec8f00-e42b-4fce-840c-5bed3998b2e6-kserve-provision-location\") pod \"9eec8f00-e42b-4fce-840c-5bed3998b2e6\" (UID: \"9eec8f00-e42b-4fce-840c-5bed3998b2e6\") "
Apr 16 18:48:29.151286 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.151263 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9eec8f00-e42b-4fce-840c-5bed3998b2e6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9eec8f00-e42b-4fce-840c-5bed3998b2e6" (UID: "9eec8f00-e42b-4fce-840c-5bed3998b2e6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:48:29.156800 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.156784 2567 scope.go:117] "RemoveContainer" containerID="50c611e8df91e40454b8c2bd38b3e8c59e13024d03464875f86cd511ee4d9875"
Apr 16 18:48:29.157030 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:48:29.157011 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c611e8df91e40454b8c2bd38b3e8c59e13024d03464875f86cd511ee4d9875\": container with ID starting with 50c611e8df91e40454b8c2bd38b3e8c59e13024d03464875f86cd511ee4d9875 not found: ID does not exist" containerID="50c611e8df91e40454b8c2bd38b3e8c59e13024d03464875f86cd511ee4d9875"
Apr 16 18:48:29.157115 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.157036 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50c611e8df91e40454b8c2bd38b3e8c59e13024d03464875f86cd511ee4d9875"} err="failed to get container status \"50c611e8df91e40454b8c2bd38b3e8c59e13024d03464875f86cd511ee4d9875\": rpc error: code = NotFound desc = could not find container \"50c611e8df91e40454b8c2bd38b3e8c59e13024d03464875f86cd511ee4d9875\": container with ID starting with 50c611e8df91e40454b8c2bd38b3e8c59e13024d03464875f86cd511ee4d9875 not found: ID does not exist"
Apr 16 18:48:29.157115 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.157079 2567 scope.go:117] "RemoveContainer" containerID="9d721a8b616faa4e9ce9807225c40175d5339eb0e5d5c67dcc4e24869690406c"
Apr 16 18:48:29.157339 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:48:29.157322 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d721a8b616faa4e9ce9807225c40175d5339eb0e5d5c67dcc4e24869690406c\": container with ID starting with 9d721a8b616faa4e9ce9807225c40175d5339eb0e5d5c67dcc4e24869690406c not found: ID does not exist" containerID="9d721a8b616faa4e9ce9807225c40175d5339eb0e5d5c67dcc4e24869690406c"
Apr 16 18:48:29.157387 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.157345 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d721a8b616faa4e9ce9807225c40175d5339eb0e5d5c67dcc4e24869690406c"} err="failed to get container status \"9d721a8b616faa4e9ce9807225c40175d5339eb0e5d5c67dcc4e24869690406c\": rpc error: code = NotFound desc = could not find container \"9d721a8b616faa4e9ce9807225c40175d5339eb0e5d5c67dcc4e24869690406c\": container with ID starting with 9d721a8b616faa4e9ce9807225c40175d5339eb0e5d5c67dcc4e24869690406c not found: ID does not exist"
Apr 16 18:48:29.252372 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.252348 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9eec8f00-e42b-4fce-840c-5bed3998b2e6-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:48:29.457541 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.457505 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"]
Apr 16 18:48:29.461167 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:29.461144 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-b8fv7"]
Apr 16 18:48:30.147479 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:30.147442 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b" event={"ID":"0fd7f3c1-92dc-4d1c-98ef-3280fcadac79","Type":"ContainerStarted","Data":"651771dacc78f10caa9fc9c9556e19dba37e533d48658d8d5c9ed512f66dd193"}
Apr 16 18:48:30.147962 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:30.147671 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"
Apr 16 18:48:30.164258 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:30.164198 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b" podStartSLOduration=6.164184919 podStartE2EDuration="6.164184919s" podCreationTimestamp="2026-04-16 18:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:48:30.163343941 +0000 UTC m=+2271.547007415" watchObservedRunningTime="2026-04-16 18:48:30.164184919 +0000 UTC m=+2271.547848389"
Apr 16 18:48:31.242361 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:48:31.242330 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eec8f00-e42b-4fce-840c-5bed3998b2e6" path="/var/lib/kubelet/pods/9eec8f00-e42b-4fce-840c-5bed3998b2e6/volumes"
Apr 16 18:49:01.152613 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:01.152568 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b" podUID="0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.56:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.56:8080: connect: connection refused"
Apr 16 18:49:11.151188 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:11.151148 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b" podUID="0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.56:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.56:8080: connect: connection refused"
Apr 16 18:49:21.151394 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:21.151351 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b" podUID="0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.56:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.56:8080: connect: connection refused"
Apr 16 18:49:31.151260 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:31.151215 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b" podUID="0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.56:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.56:8080: connect: connection refused"
Apr 16 18:49:41.151329 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:41.151288 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b" podUID="0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.56:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.56:8080: connect: connection refused"
Apr 16 18:49:51.154601 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:51.154571 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"
Apr 16 18:49:54.552607 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:54.552568 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"]
Apr 16 18:49:54.553030 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:54.552916 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b" podUID="0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" containerName="kserve-container" containerID="cri-o://651771dacc78f10caa9fc9c9556e19dba37e533d48658d8d5c9ed512f66dd193" gracePeriod=30
Apr 16 18:49:54.628128 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:54.628094 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"]
Apr 16 18:49:54.628430 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:54.628418 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9eec8f00-e42b-4fce-840c-5bed3998b2e6" containerName="storage-initializer"
Apr 16 18:49:54.628478 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:54.628432 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eec8f00-e42b-4fce-840c-5bed3998b2e6" containerName="storage-initializer"
Apr 16 18:49:54.628478 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:54.628446 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9eec8f00-e42b-4fce-840c-5bed3998b2e6" containerName="kserve-container"
Apr 16 18:49:54.628478 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:54.628451 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eec8f00-e42b-4fce-840c-5bed3998b2e6" containerName="kserve-container"
Apr 16 18:49:54.628577 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:54.628517 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="9eec8f00-e42b-4fce-840c-5bed3998b2e6" containerName="kserve-container"
Apr 16 18:49:54.631769 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:54.631745 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"
Apr 16 18:49:54.639781 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:54.639756 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"]
Apr 16 18:49:54.666382 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:54.666352 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3af865b-8cd0-46be-801a-a019b09e1db3-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4\" (UID: \"f3af865b-8cd0-46be-801a-a019b09e1db3\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"
Apr 16 18:49:54.767174 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:54.767136 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3af865b-8cd0-46be-801a-a019b09e1db3-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4\" (UID: \"f3af865b-8cd0-46be-801a-a019b09e1db3\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"
Apr 16 18:49:54.767603 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:54.767581 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3af865b-8cd0-46be-801a-a019b09e1db3-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4\" (UID: \"f3af865b-8cd0-46be-801a-a019b09e1db3\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"
Apr 16 18:49:54.943709 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:54.943677 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"
Apr 16 18:49:55.071381 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:55.071310 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"]
Apr 16 18:49:55.073899 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:49:55.073869 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3af865b_8cd0_46be_801a_a019b09e1db3.slice/crio-ae15703bb53361605b86847168664abffdd54706b7e57274d50de9a3b883dc48 WatchSource:0}: Error finding container ae15703bb53361605b86847168664abffdd54706b7e57274d50de9a3b883dc48: Status 404 returned error can't find the container with id ae15703bb53361605b86847168664abffdd54706b7e57274d50de9a3b883dc48
Apr 16 18:49:55.075711 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:55.075692 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:49:55.424350 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:55.424263 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4" event={"ID":"f3af865b-8cd0-46be-801a-a019b09e1db3","Type":"ContainerStarted","Data":"1c6437f794bf3b95b6afadf89a1aef3cbe6fcf8f3ef5d810e11f8f35ed4b0820"}
Apr 16 18:49:55.424350 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:55.424303 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4" event={"ID":"f3af865b-8cd0-46be-801a-a019b09e1db3","Type":"ContainerStarted","Data":"ae15703bb53361605b86847168664abffdd54706b7e57274d50de9a3b883dc48"}
Apr 16 18:49:59.191448 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.191423 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"
Apr 16 18:49:59.305136 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.305099 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fd7f3c1-92dc-4d1c-98ef-3280fcadac79-kserve-provision-location\") pod \"0fd7f3c1-92dc-4d1c-98ef-3280fcadac79\" (UID: \"0fd7f3c1-92dc-4d1c-98ef-3280fcadac79\") "
Apr 16 18:49:59.305469 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.305443 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fd7f3c1-92dc-4d1c-98ef-3280fcadac79-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" (UID: "0fd7f3c1-92dc-4d1c-98ef-3280fcadac79"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:49:59.305594 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.305577 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fd7f3c1-92dc-4d1c-98ef-3280fcadac79-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:49:59.438294 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.438254 2567 generic.go:358] "Generic (PLEG): container finished" podID="0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" containerID="651771dacc78f10caa9fc9c9556e19dba37e533d48658d8d5c9ed512f66dd193" exitCode=0
Apr 16 18:49:59.438471 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.438328 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"
Apr 16 18:49:59.438471 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.438339 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b" event={"ID":"0fd7f3c1-92dc-4d1c-98ef-3280fcadac79","Type":"ContainerDied","Data":"651771dacc78f10caa9fc9c9556e19dba37e533d48658d8d5c9ed512f66dd193"}
Apr 16 18:49:59.438471 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.438381 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b" event={"ID":"0fd7f3c1-92dc-4d1c-98ef-3280fcadac79","Type":"ContainerDied","Data":"c835b995ae832e623ad0d2ae2a9ef37f0d6e276bb9c5ac098312cdf0b0fd84ca"}
Apr 16 18:49:59.438471 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.438405 2567 scope.go:117] "RemoveContainer" containerID="651771dacc78f10caa9fc9c9556e19dba37e533d48658d8d5c9ed512f66dd193"
Apr 16 18:49:59.439752 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.439729 2567 generic.go:358] "Generic (PLEG): container finished" podID="f3af865b-8cd0-46be-801a-a019b09e1db3" containerID="1c6437f794bf3b95b6afadf89a1aef3cbe6fcf8f3ef5d810e11f8f35ed4b0820" exitCode=0
Apr 16 18:49:59.439888 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.439784 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4" event={"ID":"f3af865b-8cd0-46be-801a-a019b09e1db3","Type":"ContainerDied","Data":"1c6437f794bf3b95b6afadf89a1aef3cbe6fcf8f3ef5d810e11f8f35ed4b0820"}
Apr 16 18:49:59.448897 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.448871 2567 scope.go:117] "RemoveContainer" containerID="6ff9806c5b1005131ef2ae5ca622e215175a575625b00e6b5f30361c43b92367"
Apr 16 18:49:59.456079 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.456061 2567 scope.go:117] "RemoveContainer" containerID="651771dacc78f10caa9fc9c9556e19dba37e533d48658d8d5c9ed512f66dd193"
Apr 16 18:49:59.456312 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:49:59.456294 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"651771dacc78f10caa9fc9c9556e19dba37e533d48658d8d5c9ed512f66dd193\": container with ID starting with 651771dacc78f10caa9fc9c9556e19dba37e533d48658d8d5c9ed512f66dd193 not found: ID does not exist" containerID="651771dacc78f10caa9fc9c9556e19dba37e533d48658d8d5c9ed512f66dd193"
Apr 16 18:49:59.456355 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.456320 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"651771dacc78f10caa9fc9c9556e19dba37e533d48658d8d5c9ed512f66dd193"} err="failed to get container status \"651771dacc78f10caa9fc9c9556e19dba37e533d48658d8d5c9ed512f66dd193\": rpc error: code = NotFound desc = could not find container \"651771dacc78f10caa9fc9c9556e19dba37e533d48658d8d5c9ed512f66dd193\": container with ID starting with 651771dacc78f10caa9fc9c9556e19dba37e533d48658d8d5c9ed512f66dd193 not found: ID does not exist"
Apr 16 18:49:59.456355 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.456339 2567 scope.go:117] "RemoveContainer" containerID="6ff9806c5b1005131ef2ae5ca622e215175a575625b00e6b5f30361c43b92367"
Apr 16 18:49:59.456557 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:49:59.456542 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ff9806c5b1005131ef2ae5ca622e215175a575625b00e6b5f30361c43b92367\": container with ID starting with 6ff9806c5b1005131ef2ae5ca622e215175a575625b00e6b5f30361c43b92367 not found: ID does not exist" containerID="6ff9806c5b1005131ef2ae5ca622e215175a575625b00e6b5f30361c43b92367"
Apr 16 18:49:59.456597 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.456560 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ff9806c5b1005131ef2ae5ca622e215175a575625b00e6b5f30361c43b92367"} err="failed to get container status \"6ff9806c5b1005131ef2ae5ca622e215175a575625b00e6b5f30361c43b92367\": rpc error: code = NotFound desc = could not find container \"6ff9806c5b1005131ef2ae5ca622e215175a575625b00e6b5f30361c43b92367\": container with ID starting with 6ff9806c5b1005131ef2ae5ca622e215175a575625b00e6b5f30361c43b92367 not found: ID does not exist"
Apr 16 18:49:59.474768 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.474747 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"]
Apr 16 18:49:59.480431 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:49:59.480410 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-jlb9b"]
Apr 16 18:50:00.445223 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:50:00.445192 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4" event={"ID":"f3af865b-8cd0-46be-801a-a019b09e1db3","Type":"ContainerStarted","Data":"5c0eea68ac075f32a2c110582a2492a7ce016727a0819d29e2af16032c62523a"}
Apr 16 18:50:00.445622 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:50:00.445401 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"
Apr 16 18:50:00.464014 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:50:00.463951 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4" podStartSLOduration=6.463937649 podStartE2EDuration="6.463937649s" podCreationTimestamp="2026-04-16 18:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:50:00.462367553 +0000 UTC m=+2361.846031024" watchObservedRunningTime="2026-04-16 18:50:00.463937649 +0000 UTC m=+2361.847601120"
Apr 16 18:50:01.242278 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:50:01.242241 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" path="/var/lib/kubelet/pods/0fd7f3c1-92dc-4d1c-98ef-3280fcadac79/volumes"
Apr 16 18:50:31.450957 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:50:31.450910 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4" podUID="f3af865b-8cd0-46be-801a-a019b09e1db3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.57:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.57:8080: connect: connection refused"
Apr 16 18:50:41.449333 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:50:41.449283 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4" podUID="f3af865b-8cd0-46be-801a-a019b09e1db3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.57:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.57:8080: connect: connection refused"
Apr 16 18:50:51.450018 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:50:51.449970 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4" podUID="f3af865b-8cd0-46be-801a-a019b09e1db3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.57:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.57:8080: connect: connection refused"
Apr 16 18:51:01.449838 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:01.449745 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4" podUID="f3af865b-8cd0-46be-801a-a019b09e1db3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.57:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.57:8080: connect: connection refused"
Apr 16 18:51:04.237305 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:04.237266 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4" podUID="f3af865b-8cd0-46be-801a-a019b09e1db3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.57:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.57:8080: connect: connection refused"
Apr 16 18:51:14.241006 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:14.240976 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"
Apr 16 18:51:14.726137 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:14.726104 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"]
Apr 16 18:51:14.726541 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:14.726479 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4" podUID="f3af865b-8cd0-46be-801a-a019b09e1db3" containerName="kserve-container" containerID="cri-o://5c0eea68ac075f32a2c110582a2492a7ce016727a0819d29e2af16032c62523a" gracePeriod=30
Apr 16 18:51:16.908623 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:16.908588 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr"]
Apr 16 18:51:16.909108 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:16.909093 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" containerName="kserve-container"
Apr 16 18:51:16.909165 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:16.909111 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" containerName="kserve-container"
Apr 16 18:51:16.909165 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:16.909140 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" containerName="storage-initializer"
Apr 16 18:51:16.909165 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:16.909148 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" containerName="storage-initializer"
Apr 16 18:51:16.909257 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:16.909224 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fd7f3c1-92dc-4d1c-98ef-3280fcadac79" containerName="kserve-container"
Apr 16 18:51:16.912527 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:16.912495 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr"
Apr 16 18:51:16.923462 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:16.923434 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr"]
Apr 16 18:51:17.066096 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:17.066037 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bde7611b-263b-43ca-ba04-e19d4ab45349-kserve-provision-location\") pod \"isvc-sklearn-predictor-67bb8f7cbc-msrkr\" (UID: \"bde7611b-263b-43ca-ba04-e19d4ab45349\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr"
Apr 16 18:51:17.167270 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:17.167185 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bde7611b-263b-43ca-ba04-e19d4ab45349-kserve-provision-location\") pod \"isvc-sklearn-predictor-67bb8f7cbc-msrkr\" (UID: \"bde7611b-263b-43ca-ba04-e19d4ab45349\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr"
Apr 16 18:51:17.167672 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:17.167652 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bde7611b-263b-43ca-ba04-e19d4ab45349-kserve-provision-location\") pod \"isvc-sklearn-predictor-67bb8f7cbc-msrkr\" (UID: \"bde7611b-263b-43ca-ba04-e19d4ab45349\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr"
Apr 16 18:51:17.224625 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:17.224590 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr"
Apr 16 18:51:17.345814 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:17.345786 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr"]
Apr 16 18:51:17.348661 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:51:17.348631 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde7611b_263b_43ca_ba04_e19d4ab45349.slice/crio-77c1b51376c56638fd2145e5f940bf87363f93fa480a653d44b73595d31086a2 WatchSource:0}: Error finding container 77c1b51376c56638fd2145e5f940bf87363f93fa480a653d44b73595d31086a2: Status 404 returned error can't find the container with id 77c1b51376c56638fd2145e5f940bf87363f93fa480a653d44b73595d31086a2
Apr 16 18:51:17.709816 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:17.709779 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" event={"ID":"bde7611b-263b-43ca-ba04-e19d4ab45349","Type":"ContainerStarted","Data":"37e764f6820594d55f6750a17ccff192ea9c2005e3c7f065fa6b26c60a12addc"}
Apr 16 18:51:17.709816 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:17.709815 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" event={"ID":"bde7611b-263b-43ca-ba04-e19d4ab45349","Type":"ContainerStarted","Data":"77c1b51376c56638fd2145e5f940bf87363f93fa480a653d44b73595d31086a2"}
Apr 16 18:51:19.570918 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.570894 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"
Apr 16 18:51:19.588374 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.588348 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3af865b-8cd0-46be-801a-a019b09e1db3-kserve-provision-location\") pod \"f3af865b-8cd0-46be-801a-a019b09e1db3\" (UID: \"f3af865b-8cd0-46be-801a-a019b09e1db3\") "
Apr 16 18:51:19.588710 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.588683 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3af865b-8cd0-46be-801a-a019b09e1db3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f3af865b-8cd0-46be-801a-a019b09e1db3" (UID: "f3af865b-8cd0-46be-801a-a019b09e1db3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:51:19.689249 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.689219 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3af865b-8cd0-46be-801a-a019b09e1db3-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:51:19.717273 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.717237 2567 generic.go:358] "Generic (PLEG): container finished" podID="f3af865b-8cd0-46be-801a-a019b09e1db3" containerID="5c0eea68ac075f32a2c110582a2492a7ce016727a0819d29e2af16032c62523a" exitCode=0
Apr 16 18:51:19.717412 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.717312 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"
Apr 16 18:51:19.717412 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.717326 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4" event={"ID":"f3af865b-8cd0-46be-801a-a019b09e1db3","Type":"ContainerDied","Data":"5c0eea68ac075f32a2c110582a2492a7ce016727a0819d29e2af16032c62523a"}
Apr 16 18:51:19.717412 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.717367 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4" event={"ID":"f3af865b-8cd0-46be-801a-a019b09e1db3","Type":"ContainerDied","Data":"ae15703bb53361605b86847168664abffdd54706b7e57274d50de9a3b883dc48"}
Apr 16 18:51:19.717412 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.717384 2567 scope.go:117] "RemoveContainer" containerID="5c0eea68ac075f32a2c110582a2492a7ce016727a0819d29e2af16032c62523a"
Apr 16 18:51:19.725522 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.725295 2567 scope.go:117] "RemoveContainer" containerID="1c6437f794bf3b95b6afadf89a1aef3cbe6fcf8f3ef5d810e11f8f35ed4b0820"
Apr 16 18:51:19.732609 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.732588 2567 scope.go:117] "RemoveContainer" containerID="5c0eea68ac075f32a2c110582a2492a7ce016727a0819d29e2af16032c62523a"
Apr 16 18:51:19.732848 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:51:19.732826 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c0eea68ac075f32a2c110582a2492a7ce016727a0819d29e2af16032c62523a\": container with ID starting with 5c0eea68ac075f32a2c110582a2492a7ce016727a0819d29e2af16032c62523a not found: ID does not exist" containerID="5c0eea68ac075f32a2c110582a2492a7ce016727a0819d29e2af16032c62523a"
Apr 16 18:51:19.732926 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.732860 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c0eea68ac075f32a2c110582a2492a7ce016727a0819d29e2af16032c62523a"} err="failed to get container status \"5c0eea68ac075f32a2c110582a2492a7ce016727a0819d29e2af16032c62523a\": rpc error: code = NotFound desc = could not find container \"5c0eea68ac075f32a2c110582a2492a7ce016727a0819d29e2af16032c62523a\": container with ID starting with 5c0eea68ac075f32a2c110582a2492a7ce016727a0819d29e2af16032c62523a not found: ID does not exist"
Apr 16 18:51:19.732926 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.732884 2567 scope.go:117] "RemoveContainer" containerID="1c6437f794bf3b95b6afadf89a1aef3cbe6fcf8f3ef5d810e11f8f35ed4b0820"
Apr 16 18:51:19.733144 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:51:19.733128 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c6437f794bf3b95b6afadf89a1aef3cbe6fcf8f3ef5d810e11f8f35ed4b0820\": container with ID starting with 1c6437f794bf3b95b6afadf89a1aef3cbe6fcf8f3ef5d810e11f8f35ed4b0820 not found: ID does not exist" containerID="1c6437f794bf3b95b6afadf89a1aef3cbe6fcf8f3ef5d810e11f8f35ed4b0820"
Apr 16 18:51:19.733206 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.733153 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6437f794bf3b95b6afadf89a1aef3cbe6fcf8f3ef5d810e11f8f35ed4b0820"} err="failed to get container status \"1c6437f794bf3b95b6afadf89a1aef3cbe6fcf8f3ef5d810e11f8f35ed4b0820\": rpc error: code = NotFound desc = could not find container \"1c6437f794bf3b95b6afadf89a1aef3cbe6fcf8f3ef5d810e11f8f35ed4b0820\": container with ID starting with 1c6437f794bf3b95b6afadf89a1aef3cbe6fcf8f3ef5d810e11f8f35ed4b0820 not found: ID does not exist"
Apr 16 18:51:19.739803 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.739781 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"]
Apr 16 18:51:19.743177 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:19.743156 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-9m6k4"]
Apr 16 18:51:21.241634 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:21.241554 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3af865b-8cd0-46be-801a-a019b09e1db3" path="/var/lib/kubelet/pods/f3af865b-8cd0-46be-801a-a019b09e1db3/volumes"
Apr 16 18:51:21.725519 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:21.725475 2567 generic.go:358] "Generic (PLEG): container finished" podID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerID="37e764f6820594d55f6750a17ccff192ea9c2005e3c7f065fa6b26c60a12addc" exitCode=0
Apr 16 18:51:21.725694 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:21.725548 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" event={"ID":"bde7611b-263b-43ca-ba04-e19d4ab45349","Type":"ContainerDied","Data":"37e764f6820594d55f6750a17ccff192ea9c2005e3c7f065fa6b26c60a12addc"}
Apr 16 18:51:22.730696 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:22.730662 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" event={"ID":"bde7611b-263b-43ca-ba04-e19d4ab45349","Type":"ContainerStarted","Data":"c289481c8435b4f50fcd81dc2a1870c9f2d232ea8ae20fd0db4e8236c4660955"}
Apr 16 18:51:22.731160 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:22.730990 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr"
Apr 16 18:51:22.732432 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:22.732403 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 16 18:51:22.749182 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:22.749137 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" podStartSLOduration=6.749122358 podStartE2EDuration="6.749122358s" podCreationTimestamp="2026-04-16 18:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:51:22.746936354 +0000 UTC m=+2444.130599836" watchObservedRunningTime="2026-04-16 18:51:22.749122358 +0000 UTC m=+2444.132785829"
Apr 16 18:51:23.733825 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:23.733788 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 16 18:51:33.734812 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:33.734767 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 16 18:51:43.734527 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:43.734477 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 16 18:51:53.734429 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:51:53.734383 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 16 18:52:03.734296 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:03.734249 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 16 18:52:13.733941 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:13.733889 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 16 18:52:23.734145 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:23.734102 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused"
Apr 16 18:52:33.735061 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:33.734964 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr"
Apr 16 18:52:37.032030 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.031993 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr"]
Apr 16 18:52:37.032494 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.032289 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerName="kserve-container" containerID="cri-o://c289481c8435b4f50fcd81dc2a1870c9f2d232ea8ae20fd0db4e8236c4660955" gracePeriod=30
Apr 16 18:52:37.093581 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.093541 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p"]
Apr 16 18:52:37.094031 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.094005 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3af865b-8cd0-46be-801a-a019b09e1db3" containerName="kserve-container"
Apr 16 18:52:37.094031 ip-10-0-141-192
kubenswrapper[2567]: I0416 18:52:37.094029 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3af865b-8cd0-46be-801a-a019b09e1db3" containerName="kserve-container" Apr 16 18:52:37.094031 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.094037 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3af865b-8cd0-46be-801a-a019b09e1db3" containerName="storage-initializer" Apr 16 18:52:37.094288 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.094060 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3af865b-8cd0-46be-801a-a019b09e1db3" containerName="storage-initializer" Apr 16 18:52:37.094288 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.094135 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3af865b-8cd0-46be-801a-a019b09e1db3" containerName="kserve-container" Apr 16 18:52:37.097229 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.097212 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" Apr 16 18:52:37.104895 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.104842 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p"] Apr 16 18:52:37.113221 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.113197 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df9b3545-c4ff-4c6e-b63e-a54b75d28e8d-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-lmv2p\" (UID: \"df9b3545-c4ff-4c6e-b63e-a54b75d28e8d\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" Apr 16 18:52:37.214064 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.214007 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df9b3545-c4ff-4c6e-b63e-a54b75d28e8d-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-lmv2p\" (UID: \"df9b3545-c4ff-4c6e-b63e-a54b75d28e8d\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" Apr 16 18:52:37.214495 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.214471 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df9b3545-c4ff-4c6e-b63e-a54b75d28e8d-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-lmv2p\" (UID: \"df9b3545-c4ff-4c6e-b63e-a54b75d28e8d\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" Apr 16 18:52:37.409084 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.408975 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" Apr 16 18:52:37.534384 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.534345 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p"] Apr 16 18:52:37.537118 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:52:37.537084 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf9b3545_c4ff_4c6e_b63e_a54b75d28e8d.slice/crio-0c204810fa2d273cd62860410c22315494da0075a14c3dd68c2d3ea7fa80b07f WatchSource:0}: Error finding container 0c204810fa2d273cd62860410c22315494da0075a14c3dd68c2d3ea7fa80b07f: Status 404 returned error can't find the container with id 0c204810fa2d273cd62860410c22315494da0075a14c3dd68c2d3ea7fa80b07f Apr 16 18:52:37.983554 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.983522 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" event={"ID":"df9b3545-c4ff-4c6e-b63e-a54b75d28e8d","Type":"ContainerStarted","Data":"70f5c6941d18d24f53ffdae82401edffb3f04368dff162e13c47e40685922672"} Apr 16 18:52:37.983554 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:37.983560 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" event={"ID":"df9b3545-c4ff-4c6e-b63e-a54b75d28e8d","Type":"ContainerStarted","Data":"0c204810fa2d273cd62860410c22315494da0075a14c3dd68c2d3ea7fa80b07f"} Apr 16 18:52:41.380865 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:41.380835 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" Apr 16 18:52:41.455767 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:41.455730 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bde7611b-263b-43ca-ba04-e19d4ab45349-kserve-provision-location\") pod \"bde7611b-263b-43ca-ba04-e19d4ab45349\" (UID: \"bde7611b-263b-43ca-ba04-e19d4ab45349\") " Apr 16 18:52:41.456196 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:41.456170 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde7611b-263b-43ca-ba04-e19d4ab45349-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bde7611b-263b-43ca-ba04-e19d4ab45349" (UID: "bde7611b-263b-43ca-ba04-e19d4ab45349"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:52:41.557178 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:41.557142 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bde7611b-263b-43ca-ba04-e19d4ab45349-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:52:41.997975 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:41.997940 2567 generic.go:358] "Generic (PLEG): container finished" podID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerID="c289481c8435b4f50fcd81dc2a1870c9f2d232ea8ae20fd0db4e8236c4660955" exitCode=0 Apr 16 18:52:41.998152 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:41.998016 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" Apr 16 18:52:41.998152 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:41.998021 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" event={"ID":"bde7611b-263b-43ca-ba04-e19d4ab45349","Type":"ContainerDied","Data":"c289481c8435b4f50fcd81dc2a1870c9f2d232ea8ae20fd0db4e8236c4660955"} Apr 16 18:52:41.998152 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:41.998076 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr" event={"ID":"bde7611b-263b-43ca-ba04-e19d4ab45349","Type":"ContainerDied","Data":"77c1b51376c56638fd2145e5f940bf87363f93fa480a653d44b73595d31086a2"} Apr 16 18:52:41.998152 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:41.998097 2567 scope.go:117] "RemoveContainer" containerID="c289481c8435b4f50fcd81dc2a1870c9f2d232ea8ae20fd0db4e8236c4660955" Apr 16 18:52:41.999614 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:41.999593 2567 generic.go:358] "Generic (PLEG): container finished" podID="df9b3545-c4ff-4c6e-b63e-a54b75d28e8d" containerID="70f5c6941d18d24f53ffdae82401edffb3f04368dff162e13c47e40685922672" exitCode=0 Apr 16 18:52:41.999702 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:41.999639 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" event={"ID":"df9b3545-c4ff-4c6e-b63e-a54b75d28e8d","Type":"ContainerDied","Data":"70f5c6941d18d24f53ffdae82401edffb3f04368dff162e13c47e40685922672"} Apr 16 18:52:42.006523 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:42.006393 2567 scope.go:117] "RemoveContainer" containerID="37e764f6820594d55f6750a17ccff192ea9c2005e3c7f065fa6b26c60a12addc" Apr 16 18:52:42.013690 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:42.013666 2567 scope.go:117] "RemoveContainer" containerID="c289481c8435b4f50fcd81dc2a1870c9f2d232ea8ae20fd0db4e8236c4660955" Apr 16 18:52:42.013967 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:52:42.013950 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c289481c8435b4f50fcd81dc2a1870c9f2d232ea8ae20fd0db4e8236c4660955\": container with ID starting with c289481c8435b4f50fcd81dc2a1870c9f2d232ea8ae20fd0db4e8236c4660955 not found: ID does not exist" containerID="c289481c8435b4f50fcd81dc2a1870c9f2d232ea8ae20fd0db4e8236c4660955" Apr 16 18:52:42.014026 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:42.013975 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c289481c8435b4f50fcd81dc2a1870c9f2d232ea8ae20fd0db4e8236c4660955"} err="failed to get container status \"c289481c8435b4f50fcd81dc2a1870c9f2d232ea8ae20fd0db4e8236c4660955\": rpc error: code = NotFound desc = could not find container \"c289481c8435b4f50fcd81dc2a1870c9f2d232ea8ae20fd0db4e8236c4660955\": container with ID starting with c289481c8435b4f50fcd81dc2a1870c9f2d232ea8ae20fd0db4e8236c4660955 not found: ID does not exist" Apr 16 18:52:42.014026 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:42.013991 2567 scope.go:117] "RemoveContainer" containerID="37e764f6820594d55f6750a17ccff192ea9c2005e3c7f065fa6b26c60a12addc" Apr 16 18:52:42.014248 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:52:42.014231 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"37e764f6820594d55f6750a17ccff192ea9c2005e3c7f065fa6b26c60a12addc\": container with ID starting with 37e764f6820594d55f6750a17ccff192ea9c2005e3c7f065fa6b26c60a12addc not found: ID does not exist" containerID="37e764f6820594d55f6750a17ccff192ea9c2005e3c7f065fa6b26c60a12addc" Apr 16 18:52:42.014289 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:42.014257 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e764f6820594d55f6750a17ccff192ea9c2005e3c7f065fa6b26c60a12addc"} err="failed to get container status \"37e764f6820594d55f6750a17ccff192ea9c2005e3c7f065fa6b26c60a12addc\": rpc error: code = NotFound desc = could not find container \"37e764f6820594d55f6750a17ccff192ea9c2005e3c7f065fa6b26c60a12addc\": container with ID starting with 37e764f6820594d55f6750a17ccff192ea9c2005e3c7f065fa6b26c60a12addc not found: ID does not exist" Apr 16 18:52:42.032433 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:42.032400 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr"] Apr 16 18:52:42.035717 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:42.035691 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67bb8f7cbc-msrkr"] Apr 16 18:52:43.004572 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:43.004541 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" event={"ID":"df9b3545-c4ff-4c6e-b63e-a54b75d28e8d","Type":"ContainerStarted","Data":"486733b064f6fb43df6977f2ecec1fa3012c661635e1a37a77570768fc65b751"} Apr 16 18:52:43.004916 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:43.004738 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" Apr 16 18:52:43.022679 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:43.022633 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" podStartSLOduration=6.022620454 podStartE2EDuration="6.022620454s" podCreationTimestamp="2026-04-16 18:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:52:43.021331366 +0000 UTC m=+2524.404994857" watchObservedRunningTime="2026-04-16 18:52:43.022620454 +0000 UTC m=+2524.406283925" Apr 16 18:52:43.243790 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:52:43.243750 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" path="/var/lib/kubelet/pods/bde7611b-263b-43ca-ba04-e19d4ab45349/volumes" Apr 16 18:53:14.037937 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:14.037895 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" podUID="df9b3545-c4ff-4c6e-b63e-a54b75d28e8d" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 18:53:24.010595 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:24.010566 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" Apr 16 18:53:27.183268 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.183231 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p"] Apr 16 18:53:27.184142 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.184108 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" podUID="df9b3545-c4ff-4c6e-b63e-a54b75d28e8d" containerName="kserve-container" containerID="cri-o://486733b064f6fb43df6977f2ecec1fa3012c661635e1a37a77570768fc65b751" gracePeriod=30 Apr 16 18:53:27.242809 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.242777 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4"] Apr 16 18:53:27.243233 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.243217 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerName="kserve-container" Apr 16 18:53:27.243305 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.243236 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerName="kserve-container" Apr 16 18:53:27.243305 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.243261 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerName="storage-initializer" Apr 16 18:53:27.243305 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.243266 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerName="storage-initializer" Apr 16 18:53:27.243408 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.243325 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="bde7611b-263b-43ca-ba04-e19d4ab45349" containerName="kserve-container" Apr 16 18:53:27.246198 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.246175 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" Apr 16 18:53:27.254079 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.253872 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4"] Apr 16 18:53:27.349681 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.349637 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/221bf3f0-0725-4ae7-a198-238422cb828b-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-667495d4b9-76xr4\" (UID: \"221bf3f0-0725-4ae7-a198-238422cb828b\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" Apr 16 18:53:27.450805 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.450503 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/221bf3f0-0725-4ae7-a198-238422cb828b-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-667495d4b9-76xr4\" (UID: \"221bf3f0-0725-4ae7-a198-238422cb828b\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" Apr 16 18:53:27.451101 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.451081 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/221bf3f0-0725-4ae7-a198-238422cb828b-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-667495d4b9-76xr4\" (UID: \"221bf3f0-0725-4ae7-a198-238422cb828b\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" Apr 16 18:53:27.557679 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.557641 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" Apr 16 18:53:27.684263 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:27.684236 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4"] Apr 16 18:53:27.686417 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:53:27.686386 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod221bf3f0_0725_4ae7_a198_238422cb828b.slice/crio-a01aee69a06947978b36ea90d75715e97737cec2cd2cae56f6800a3f0a5857d9 WatchSource:0}: Error finding container a01aee69a06947978b36ea90d75715e97737cec2cd2cae56f6800a3f0a5857d9: Status 404 returned error can't find the container with id a01aee69a06947978b36ea90d75715e97737cec2cd2cae56f6800a3f0a5857d9 Apr 16 18:53:28.154829 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:28.154750 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" event={"ID":"221bf3f0-0725-4ae7-a198-238422cb828b","Type":"ContainerStarted","Data":"2ed599813c7de33037d450230ef1b57ffa901ea0ad5d4fb57868d4bb01f7c82e"} Apr 16 18:53:28.154829 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:28.154784 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" event={"ID":"221bf3f0-0725-4ae7-a198-238422cb828b","Type":"ContainerStarted","Data":"a01aee69a06947978b36ea90d75715e97737cec2cd2cae56f6800a3f0a5857d9"} Apr 16 18:53:33.172375 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:33.172337 2567 generic.go:358] "Generic (PLEG): container finished" podID="221bf3f0-0725-4ae7-a198-238422cb828b" containerID="2ed599813c7de33037d450230ef1b57ffa901ea0ad5d4fb57868d4bb01f7c82e" exitCode=0 Apr 16 18:53:33.172735 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:33.172409 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" event={"ID":"221bf3f0-0725-4ae7-a198-238422cb828b","Type":"ContainerDied","Data":"2ed599813c7de33037d450230ef1b57ffa901ea0ad5d4fb57868d4bb01f7c82e"} Apr 16 18:53:34.009368 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:34.009324 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" podUID="df9b3545-c4ff-4c6e-b63e-a54b75d28e8d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.59:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 10.133.0.59:8080: connect: connection refused" Apr 16 18:53:34.177445 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:34.177409 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" event={"ID":"221bf3f0-0725-4ae7-a198-238422cb828b","Type":"ContainerStarted","Data":"3fd374b99791f2daf2983b3c4cbd62780797de09fde79f798a9bde4920606de2"} Apr 16 18:53:34.177928 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:34.177695 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" Apr 16 18:53:34.178879 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:34.178848 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" podUID="221bf3f0-0725-4ae7-a198-238422cb828b" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 16 18:53:34.192905 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:34.192860 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" podStartSLOduration=7.19284727 podStartE2EDuration="7.19284727s" podCreationTimestamp="2026-04-16 18:53:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:53:34.19175787 +0000 UTC m=+2575.575421342" watchObservedRunningTime="2026-04-16 18:53:34.19284727 +0000 UTC m=+2575.576510741" Apr 16 18:53:34.536684 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:34.536661 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" Apr 16 18:53:34.617342 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:34.617260 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df9b3545-c4ff-4c6e-b63e-a54b75d28e8d-kserve-provision-location\") pod \"df9b3545-c4ff-4c6e-b63e-a54b75d28e8d\" (UID: \"df9b3545-c4ff-4c6e-b63e-a54b75d28e8d\") " Apr 16 18:53:34.617563 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:34.617540 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df9b3545-c4ff-4c6e-b63e-a54b75d28e8d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "df9b3545-c4ff-4c6e-b63e-a54b75d28e8d" (UID: "df9b3545-c4ff-4c6e-b63e-a54b75d28e8d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:53:34.718771 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:34.718733 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df9b3545-c4ff-4c6e-b63e-a54b75d28e8d-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:53:35.182727 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:35.182694 2567 generic.go:358] "Generic (PLEG): container finished" podID="df9b3545-c4ff-4c6e-b63e-a54b75d28e8d" containerID="486733b064f6fb43df6977f2ecec1fa3012c661635e1a37a77570768fc65b751" exitCode=0 Apr 16 18:53:35.183126 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:35.182771 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" event={"ID":"df9b3545-c4ff-4c6e-b63e-a54b75d28e8d","Type":"ContainerDied","Data":"486733b064f6fb43df6977f2ecec1fa3012c661635e1a37a77570768fc65b751"} Apr 16 18:53:35.183126 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:35.182811 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" event={"ID":"df9b3545-c4ff-4c6e-b63e-a54b75d28e8d","Type":"ContainerDied","Data":"0c204810fa2d273cd62860410c22315494da0075a14c3dd68c2d3ea7fa80b07f"} Apr 16 18:53:35.183126 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:35.182827 2567 scope.go:117] "RemoveContainer" containerID="486733b064f6fb43df6977f2ecec1fa3012c661635e1a37a77570768fc65b751" Apr 16 18:53:35.183126 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:35.182778 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p" Apr 16 18:53:35.183355 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:35.183241 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" podUID="221bf3f0-0725-4ae7-a198-238422cb828b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 16 18:53:35.191343 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:35.191321 2567 scope.go:117] "RemoveContainer" containerID="70f5c6941d18d24f53ffdae82401edffb3f04368dff162e13c47e40685922672" Apr 16 18:53:35.198615 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:35.198593 2567 scope.go:117] "RemoveContainer" containerID="486733b064f6fb43df6977f2ecec1fa3012c661635e1a37a77570768fc65b751" Apr 16 18:53:35.198864 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:53:35.198847 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"486733b064f6fb43df6977f2ecec1fa3012c661635e1a37a77570768fc65b751\": container with ID starting with 486733b064f6fb43df6977f2ecec1fa3012c661635e1a37a77570768fc65b751 not found: ID does not exist" containerID="486733b064f6fb43df6977f2ecec1fa3012c661635e1a37a77570768fc65b751" Apr 16 18:53:35.198913 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:35.198871 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"486733b064f6fb43df6977f2ecec1fa3012c661635e1a37a77570768fc65b751"} err="failed to get container status \"486733b064f6fb43df6977f2ecec1fa3012c661635e1a37a77570768fc65b751\": rpc error: code = NotFound desc = could not find container \"486733b064f6fb43df6977f2ecec1fa3012c661635e1a37a77570768fc65b751\": container with ID starting with 486733b064f6fb43df6977f2ecec1fa3012c661635e1a37a77570768fc65b751 not found: ID does not exist" Apr 16 18:53:35.198913 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:35.198887 2567 scope.go:117] "RemoveContainer" containerID="70f5c6941d18d24f53ffdae82401edffb3f04368dff162e13c47e40685922672" Apr 16 18:53:35.199135 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:53:35.199121 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70f5c6941d18d24f53ffdae82401edffb3f04368dff162e13c47e40685922672\": container with ID starting with 70f5c6941d18d24f53ffdae82401edffb3f04368dff162e13c47e40685922672 not found: ID does not exist" containerID="70f5c6941d18d24f53ffdae82401edffb3f04368dff162e13c47e40685922672" Apr 16 18:53:35.199192 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:35.199138 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70f5c6941d18d24f53ffdae82401edffb3f04368dff162e13c47e40685922672"} err="failed to get container status \"70f5c6941d18d24f53ffdae82401edffb3f04368dff162e13c47e40685922672\": rpc error: code = NotFound desc = could not find container \"70f5c6941d18d24f53ffdae82401edffb3f04368dff162e13c47e40685922672\": container with ID starting with 70f5c6941d18d24f53ffdae82401edffb3f04368dff162e13c47e40685922672 not found: ID does not exist" Apr 16 18:53:35.205999 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:35.205956 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p"] Apr 16 18:53:35.208476 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:35.208452 
2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-lmv2p"] Apr 16 18:53:35.241861 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:35.241833 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df9b3545-c4ff-4c6e-b63e-a54b75d28e8d" path="/var/lib/kubelet/pods/df9b3545-c4ff-4c6e-b63e-a54b75d28e8d/volumes" Apr 16 18:53:45.183271 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:45.183225 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" podUID="221bf3f0-0725-4ae7-a198-238422cb828b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 16 18:53:55.184717 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:53:55.184683 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" Apr 16 18:54:04.296618 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.296580 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-667495d4b9-76xr4_221bf3f0-0725-4ae7-a198-238422cb828b/kserve-container/0.log" Apr 16 18:54:04.436547 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.436511 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4"] Apr 16 18:54:04.436930 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.436871 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" podUID="221bf3f0-0725-4ae7-a198-238422cb828b" containerName="kserve-container" containerID="cri-o://3fd374b99791f2daf2983b3c4cbd62780797de09fde79f798a9bde4920606de2" gracePeriod=30 Apr 16 18:54:04.491160 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.491125 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp"] Apr 16 18:54:04.491485 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.491471 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df9b3545-c4ff-4c6e-b63e-a54b75d28e8d" containerName="storage-initializer" Apr 16 18:54:04.491529 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.491487 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9b3545-c4ff-4c6e-b63e-a54b75d28e8d" containerName="storage-initializer" Apr 16 18:54:04.491529 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.491496 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df9b3545-c4ff-4c6e-b63e-a54b75d28e8d" containerName="kserve-container" Apr 16 18:54:04.491529 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.491503 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9b3545-c4ff-4c6e-b63e-a54b75d28e8d" containerName="kserve-container" Apr 16 18:54:04.491623 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.491594 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="df9b3545-c4ff-4c6e-b63e-a54b75d28e8d" containerName="kserve-container" Apr 16 18:54:04.494900 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.494881 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" Apr 16 18:54:04.505639 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.505614 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp"] Apr 16 18:54:04.677831 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.677743 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/315707fd-4b51-456b-9717-86bc1f225c56-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp\" (UID: \"315707fd-4b51-456b-9717-86bc1f225c56\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" Apr 16 18:54:04.778607 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.778573 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/315707fd-4b51-456b-9717-86bc1f225c56-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp\" (UID: \"315707fd-4b51-456b-9717-86bc1f225c56\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" Apr 16 18:54:04.778950 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.778927 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/315707fd-4b51-456b-9717-86bc1f225c56-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp\" (UID: \"315707fd-4b51-456b-9717-86bc1f225c56\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" Apr 16 18:54:04.806524 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:04.806490 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" Apr 16 18:54:05.022406 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:05.022381 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp"] Apr 16 18:54:05.024711 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:54:05.024683 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod315707fd_4b51_456b_9717_86bc1f225c56.slice/crio-0ff05ff0bbb4344f42d448a1f3efbeeaf8217a7f2ef95f48379cc09ea6c4a4fe WatchSource:0}: Error finding container 0ff05ff0bbb4344f42d448a1f3efbeeaf8217a7f2ef95f48379cc09ea6c4a4fe: Status 404 returned error can't find the container with id 0ff05ff0bbb4344f42d448a1f3efbeeaf8217a7f2ef95f48379cc09ea6c4a4fe Apr 16 18:54:05.183847 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:05.183802 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" podUID="221bf3f0-0725-4ae7-a198-238422cb828b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 16 18:54:05.283389 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:05.283287 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" event={"ID":"315707fd-4b51-456b-9717-86bc1f225c56","Type":"ContainerStarted","Data":"8ec095e1814607d16ee99be788aefc1fd58b8c9f670682f0e264711553e6a066"} Apr 16 18:54:05.283389 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:05.283326 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" event={"ID":"315707fd-4b51-456b-9717-86bc1f225c56","Type":"ContainerStarted","Data":"0ff05ff0bbb4344f42d448a1f3efbeeaf8217a7f2ef95f48379cc09ea6c4a4fe"} Apr 16 18:54:05.685207 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:05.685184 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" Apr 16 18:54:05.884778 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:05.884745 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/221bf3f0-0725-4ae7-a198-238422cb828b-kserve-provision-location\") pod \"221bf3f0-0725-4ae7-a198-238422cb828b\" (UID: \"221bf3f0-0725-4ae7-a198-238422cb828b\") " Apr 16 18:54:05.891630 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:05.891598 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/221bf3f0-0725-4ae7-a198-238422cb828b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "221bf3f0-0725-4ae7-a198-238422cb828b" (UID: "221bf3f0-0725-4ae7-a198-238422cb828b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:54:05.985781 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:05.985742 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/221bf3f0-0725-4ae7-a198-238422cb828b-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 18:54:06.287313 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:06.287272 2567 generic.go:358] "Generic (PLEG): container finished" podID="221bf3f0-0725-4ae7-a198-238422cb828b" containerID="3fd374b99791f2daf2983b3c4cbd62780797de09fde79f798a9bde4920606de2" exitCode=0 Apr 16 18:54:06.287485 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:06.287344 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" event={"ID":"221bf3f0-0725-4ae7-a198-238422cb828b","Type":"ContainerDied","Data":"3fd374b99791f2daf2983b3c4cbd62780797de09fde79f798a9bde4920606de2"} Apr 16 18:54:06.287485 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:06.287371 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" Apr 16 18:54:06.287485 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:06.287383 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4" event={"ID":"221bf3f0-0725-4ae7-a198-238422cb828b","Type":"ContainerDied","Data":"a01aee69a06947978b36ea90d75715e97737cec2cd2cae56f6800a3f0a5857d9"} Apr 16 18:54:06.287485 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:06.287397 2567 scope.go:117] "RemoveContainer" containerID="3fd374b99791f2daf2983b3c4cbd62780797de09fde79f798a9bde4920606de2" Apr 16 18:54:06.295402 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:06.295382 2567 scope.go:117] "RemoveContainer" containerID="2ed599813c7de33037d450230ef1b57ffa901ea0ad5d4fb57868d4bb01f7c82e" Apr 16 18:54:06.302560 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:06.302542 2567 scope.go:117] "RemoveContainer" containerID="3fd374b99791f2daf2983b3c4cbd62780797de09fde79f798a9bde4920606de2" Apr 16 18:54:06.302809 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:54:06.302786 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fd374b99791f2daf2983b3c4cbd62780797de09fde79f798a9bde4920606de2\": container with ID starting with 3fd374b99791f2daf2983b3c4cbd62780797de09fde79f798a9bde4920606de2 not found: ID does not exist" containerID="3fd374b99791f2daf2983b3c4cbd62780797de09fde79f798a9bde4920606de2" Apr 16 18:54:06.302909 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:06.302814 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd374b99791f2daf2983b3c4cbd62780797de09fde79f798a9bde4920606de2"} err="failed to get container status \"3fd374b99791f2daf2983b3c4cbd62780797de09fde79f798a9bde4920606de2\": rpc error: code = NotFound desc = could not find container \"3fd374b99791f2daf2983b3c4cbd62780797de09fde79f798a9bde4920606de2\": container with ID starting with 3fd374b99791f2daf2983b3c4cbd62780797de09fde79f798a9bde4920606de2 not found: ID does not exist" Apr 16 18:54:06.302909 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:06.302833 2567 scope.go:117] "RemoveContainer" containerID="2ed599813c7de33037d450230ef1b57ffa901ea0ad5d4fb57868d4bb01f7c82e" Apr 16 18:54:06.303092 ip-10-0-141-192 
kubenswrapper[2567]: E0416 18:54:06.303074 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ed599813c7de33037d450230ef1b57ffa901ea0ad5d4fb57868d4bb01f7c82e\": container with ID starting with 2ed599813c7de33037d450230ef1b57ffa901ea0ad5d4fb57868d4bb01f7c82e not found: ID does not exist" containerID="2ed599813c7de33037d450230ef1b57ffa901ea0ad5d4fb57868d4bb01f7c82e" Apr 16 18:54:06.303139 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:06.303099 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ed599813c7de33037d450230ef1b57ffa901ea0ad5d4fb57868d4bb01f7c82e"} err="failed to get container status \"2ed599813c7de33037d450230ef1b57ffa901ea0ad5d4fb57868d4bb01f7c82e\": rpc error: code = NotFound desc = could not find container \"2ed599813c7de33037d450230ef1b57ffa901ea0ad5d4fb57868d4bb01f7c82e\": container with ID starting with 2ed599813c7de33037d450230ef1b57ffa901ea0ad5d4fb57868d4bb01f7c82e not found: ID does not exist" Apr 16 18:54:06.308831 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:06.308808 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4"] Apr 16 18:54:06.311681 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:06.311663 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-667495d4b9-76xr4"] Apr 16 18:54:07.242144 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:07.242113 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="221bf3f0-0725-4ae7-a198-238422cb828b" path="/var/lib/kubelet/pods/221bf3f0-0725-4ae7-a198-238422cb828b/volumes" Apr 16 18:54:09.299238 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:09.299209 2567 generic.go:358] "Generic (PLEG): container finished" podID="315707fd-4b51-456b-9717-86bc1f225c56" containerID="8ec095e1814607d16ee99be788aefc1fd58b8c9f670682f0e264711553e6a066" exitCode=0 Apr 16 18:54:09.299614 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:09.299270 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" event={"ID":"315707fd-4b51-456b-9717-86bc1f225c56","Type":"ContainerDied","Data":"8ec095e1814607d16ee99be788aefc1fd58b8c9f670682f0e264711553e6a066"} Apr 16 18:54:10.304713 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:10.304679 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" event={"ID":"315707fd-4b51-456b-9717-86bc1f225c56","Type":"ContainerStarted","Data":"c0f2432cfc17e7106c82887a8c4947d1c8711a9b05924daf60bf405a986c14ea"} Apr 16 18:54:10.305111 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:10.304889 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" Apr 16 18:54:10.321790 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:10.321736 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" podStartSLOduration=6.32172272 podStartE2EDuration="6.32172272s" podCreationTimestamp="2026-04-16 18:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:54:10.320816021 +0000 UTC m=+2611.704479494" 
watchObservedRunningTime="2026-04-16 18:54:10.32172272 +0000 UTC m=+2611.705386245" Apr 16 18:54:41.338000 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:41.337946 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" podUID="315707fd-4b51-456b-9717-86bc1f225c56" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 18:54:51.311288 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:51.311254 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" Apr 16 18:54:54.606660 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:54.606624 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp"] Apr 16 18:54:54.607153 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:54.606948 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" podUID="315707fd-4b51-456b-9717-86bc1f225c56" containerName="kserve-container" containerID="cri-o://c0f2432cfc17e7106c82887a8c4947d1c8711a9b05924daf60bf405a986c14ea" gracePeriod=30 Apr 16 18:54:54.663987 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:54.663946 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894"] Apr 16 18:54:54.664346 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:54.664332 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="221bf3f0-0725-4ae7-a198-238422cb828b" containerName="storage-initializer" Apr 16 18:54:54.664346 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:54.664347 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="221bf3f0-0725-4ae7-a198-238422cb828b" containerName="storage-initializer" Apr 16 18:54:54.664441 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:54.664364 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="221bf3f0-0725-4ae7-a198-238422cb828b" containerName="kserve-container" Apr 16 18:54:54.664441 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:54.664371 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="221bf3f0-0725-4ae7-a198-238422cb828b" containerName="kserve-container" Apr 16 18:54:54.664441 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:54.664425 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="221bf3f0-0725-4ae7-a198-238422cb828b" containerName="kserve-container" Apr 16 18:54:54.667705 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:54.667685 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" Apr 16 18:54:54.679034 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:54.679007 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894"] Apr 16 18:54:54.802572 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:54.802536 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a4f52fe-c0b0-4d74-8296-83991486e38e-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7c6977494c-9f894\" (UID: \"7a4f52fe-c0b0-4d74-8296-83991486e38e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" Apr 16 18:54:54.903071 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:54.902972 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a4f52fe-c0b0-4d74-8296-83991486e38e-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7c6977494c-9f894\" (UID: \"7a4f52fe-c0b0-4d74-8296-83991486e38e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" Apr 16 18:54:54.903368 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:54.903351 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a4f52fe-c0b0-4d74-8296-83991486e38e-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7c6977494c-9f894\" (UID: \"7a4f52fe-c0b0-4d74-8296-83991486e38e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" Apr 16 18:54:54.979485 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:54.979456 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894"
Apr 16 18:54:55.099217 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:55.099189 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894"]
Apr 16 18:54:55.101509 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:54:55.101487 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4f52fe_c0b0_4d74_8296_83991486e38e.slice/crio-5b5175fc142b0be0ccddb1978c2d6649da26079d530b43fe41aa170b7356c58d WatchSource:0}: Error finding container 5b5175fc142b0be0ccddb1978c2d6649da26079d530b43fe41aa170b7356c58d: Status 404 returned error can't find the container with id 5b5175fc142b0be0ccddb1978c2d6649da26079d530b43fe41aa170b7356c58d
Apr 16 18:54:55.103457 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:55.103439 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:54:55.464571 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:55.464540 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" event={"ID":"7a4f52fe-c0b0-4d74-8296-83991486e38e","Type":"ContainerStarted","Data":"a4f92a8b189b00fe4eed2ee0ed05648720ce967c80ff61b74a3efa9a9298c73a"}
Apr 16 18:54:55.464571 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:55.464574 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" event={"ID":"7a4f52fe-c0b0-4d74-8296-83991486e38e","Type":"ContainerStarted","Data":"5b5175fc142b0be0ccddb1978c2d6649da26079d530b43fe41aa170b7356c58d"}
Apr 16 18:54:59.479356 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:59.479325 2567 generic.go:358] "Generic (PLEG): container finished" podID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerID="a4f92a8b189b00fe4eed2ee0ed05648720ce967c80ff61b74a3efa9a9298c73a" exitCode=0
Apr 16 18:54:59.479757 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:54:59.479382 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" event={"ID":"7a4f52fe-c0b0-4d74-8296-83991486e38e","Type":"ContainerDied","Data":"a4f92a8b189b00fe4eed2ee0ed05648720ce967c80ff61b74a3efa9a9298c73a"}
Apr 16 18:55:00.484290 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:00.484254 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" event={"ID":"7a4f52fe-c0b0-4d74-8296-83991486e38e","Type":"ContainerStarted","Data":"5a3a9fe053da9bb3e9b17577d114df177c1317b20dc0495b317a7421db26cce2"}
Apr 16 18:55:00.484672 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:00.484532 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894"
Apr 16 18:55:00.486056 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:00.486018 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused"
Apr 16 18:55:00.501669 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:00.501621 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" podStartSLOduration=6.501610107 podStartE2EDuration="6.501610107s" podCreationTimestamp="2026-04-16 18:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:55:00.500432684 +0000 UTC m=+2661.884096156" watchObservedRunningTime="2026-04-16 18:55:00.501610107 +0000 UTC m=+2661.885273578"
Apr 16 18:55:01.308564 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:01.308524 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" podUID="315707fd-4b51-456b-9717-86bc1f225c56" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.61:8080/v2/models/isvc-sklearn-v2-runtime/ready\": dial tcp 10.133.0.61:8080: connect: connection refused"
Apr 16 18:55:01.488126 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:01.488085 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused"
Apr 16 18:55:01.952614 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:01.952590 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp"
Apr 16 18:55:01.959628 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:01.959609 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/315707fd-4b51-456b-9717-86bc1f225c56-kserve-provision-location\") pod \"315707fd-4b51-456b-9717-86bc1f225c56\" (UID: \"315707fd-4b51-456b-9717-86bc1f225c56\") "
Apr 16 18:55:01.959893 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:01.959873 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/315707fd-4b51-456b-9717-86bc1f225c56-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "315707fd-4b51-456b-9717-86bc1f225c56" (UID: "315707fd-4b51-456b-9717-86bc1f225c56"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:55:02.061410 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:02.060619 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/315707fd-4b51-456b-9717-86bc1f225c56-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:55:02.492946 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:02.492908 2567 generic.go:358] "Generic (PLEG): container finished" podID="315707fd-4b51-456b-9717-86bc1f225c56" containerID="c0f2432cfc17e7106c82887a8c4947d1c8711a9b05924daf60bf405a986c14ea" exitCode=0
Apr 16 18:55:02.493386 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:02.492989 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp"
Apr 16 18:55:02.493386 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:02.492986 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" event={"ID":"315707fd-4b51-456b-9717-86bc1f225c56","Type":"ContainerDied","Data":"c0f2432cfc17e7106c82887a8c4947d1c8711a9b05924daf60bf405a986c14ea"}
Apr 16 18:55:02.493386 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:02.493078 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp" event={"ID":"315707fd-4b51-456b-9717-86bc1f225c56","Type":"ContainerDied","Data":"0ff05ff0bbb4344f42d448a1f3efbeeaf8217a7f2ef95f48379cc09ea6c4a4fe"}
Apr 16 18:55:02.493386 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:02.493094 2567 scope.go:117] "RemoveContainer" containerID="c0f2432cfc17e7106c82887a8c4947d1c8711a9b05924daf60bf405a986c14ea"
Apr 16 18:55:02.501117 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:02.501098 2567 scope.go:117] "RemoveContainer" containerID="8ec095e1814607d16ee99be788aefc1fd58b8c9f670682f0e264711553e6a066"
Apr 16 18:55:02.508585 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:02.508568 2567 scope.go:117] "RemoveContainer" containerID="c0f2432cfc17e7106c82887a8c4947d1c8711a9b05924daf60bf405a986c14ea"
Apr 16 18:55:02.508829 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:55:02.508811 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f2432cfc17e7106c82887a8c4947d1c8711a9b05924daf60bf405a986c14ea\": container with ID starting with c0f2432cfc17e7106c82887a8c4947d1c8711a9b05924daf60bf405a986c14ea not found: ID does not exist" containerID="c0f2432cfc17e7106c82887a8c4947d1c8711a9b05924daf60bf405a986c14ea"
Apr 16 18:55:02.508871 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:02.508838 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f2432cfc17e7106c82887a8c4947d1c8711a9b05924daf60bf405a986c14ea"} err="failed to get container status \"c0f2432cfc17e7106c82887a8c4947d1c8711a9b05924daf60bf405a986c14ea\": rpc error: code = NotFound desc = could not find container \"c0f2432cfc17e7106c82887a8c4947d1c8711a9b05924daf60bf405a986c14ea\": container with ID starting with c0f2432cfc17e7106c82887a8c4947d1c8711a9b05924daf60bf405a986c14ea not found: ID does not exist"
Apr 16 18:55:02.508871 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:02.508855 2567 scope.go:117] "RemoveContainer" containerID="8ec095e1814607d16ee99be788aefc1fd58b8c9f670682f0e264711553e6a066"
Apr 16 18:55:02.509106 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:55:02.509079 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ec095e1814607d16ee99be788aefc1fd58b8c9f670682f0e264711553e6a066\": container with ID starting with 8ec095e1814607d16ee99be788aefc1fd58b8c9f670682f0e264711553e6a066 not found: ID does not exist" containerID="8ec095e1814607d16ee99be788aefc1fd58b8c9f670682f0e264711553e6a066"
Apr 16 18:55:02.509148 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:02.509118 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ec095e1814607d16ee99be788aefc1fd58b8c9f670682f0e264711553e6a066"} err="failed to get container status \"8ec095e1814607d16ee99be788aefc1fd58b8c9f670682f0e264711553e6a066\": rpc error: code = NotFound desc = could not find container \"8ec095e1814607d16ee99be788aefc1fd58b8c9f670682f0e264711553e6a066\": container with ID starting with 8ec095e1814607d16ee99be788aefc1fd58b8c9f670682f0e264711553e6a066 not found: ID does not exist"
Apr 16 18:55:02.514569 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:02.514550 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp"]
Apr 16 18:55:02.521577 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:02.521554 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-7kzbp"]
Apr 16 18:55:03.242963 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:03.242931 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="315707fd-4b51-456b-9717-86bc1f225c56" path="/var/lib/kubelet/pods/315707fd-4b51-456b-9717-86bc1f225c56/volumes"
Apr 16 18:55:11.488452 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:11.488399 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused"
Apr 16 18:55:21.488186 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:21.488145 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused"
Apr 16 18:55:31.488650 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:31.488556 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused"
Apr 16 18:55:41.488958 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:41.488904 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused"
Apr 16 18:55:51.488452 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:55:51.488401 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused"
Apr 16 18:56:01.488917 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:01.488870 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused"
Apr 16 18:56:11.241466 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:11.241436 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894"
Apr 16 18:56:14.875925 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:14.875872 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894"]
Apr 16 18:56:14.876395 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:14.876261 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerName="kserve-container" containerID="cri-o://5a3a9fe053da9bb3e9b17577d114df177c1317b20dc0495b317a7421db26cce2" gracePeriod=30
Apr 16 18:56:14.939364 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:14.939321 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"]
Apr 16 18:56:14.939685 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:14.939673 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="315707fd-4b51-456b-9717-86bc1f225c56" containerName="kserve-container"
Apr 16 18:56:14.939738 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:14.939687 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="315707fd-4b51-456b-9717-86bc1f225c56" containerName="kserve-container"
Apr 16 18:56:14.939738 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:14.939695 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="315707fd-4b51-456b-9717-86bc1f225c56" containerName="storage-initializer"
Apr 16 18:56:14.939738 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:14.939719 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="315707fd-4b51-456b-9717-86bc1f225c56" containerName="storage-initializer"
Apr 16 18:56:14.939832 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:14.939778 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="315707fd-4b51-456b-9717-86bc1f225c56" containerName="kserve-container"
Apr 16 18:56:14.942448 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:14.942427 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"
Apr 16 18:56:14.953297 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:14.953258 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"]
Apr 16 18:56:14.958808 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:14.958770 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/690e0b75-d4f3-4541-bf43-2a81e232bd80-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth\" (UID: \"690e0b75-d4f3-4541-bf43-2a81e232bd80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"
Apr 16 18:56:15.060076 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:15.059993 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/690e0b75-d4f3-4541-bf43-2a81e232bd80-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth\" (UID: \"690e0b75-d4f3-4541-bf43-2a81e232bd80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"
Apr 16 18:56:15.060429 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:15.060406 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/690e0b75-d4f3-4541-bf43-2a81e232bd80-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth\" (UID: \"690e0b75-d4f3-4541-bf43-2a81e232bd80\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"
Apr 16 18:56:15.254666 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:15.254627 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"
Apr 16 18:56:15.388314 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:15.388280 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"]
Apr 16 18:56:15.391077 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:56:15.391032 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod690e0b75_d4f3_4541_bf43_2a81e232bd80.slice/crio-eec45078a3709eee0c51f3281e77c73a94f34b004e1f5706c205d151a7623355 WatchSource:0}: Error finding container eec45078a3709eee0c51f3281e77c73a94f34b004e1f5706c205d151a7623355: Status 404 returned error can't find the container with id eec45078a3709eee0c51f3281e77c73a94f34b004e1f5706c205d151a7623355
Apr 16 18:56:15.735087 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:15.735028 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" event={"ID":"690e0b75-d4f3-4541-bf43-2a81e232bd80","Type":"ContainerStarted","Data":"a4c8e7672886d14356f067b5263d66e607e93216a634883390195c9fe069f9de"}
Apr 16 18:56:15.735087 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:15.735084 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" event={"ID":"690e0b75-d4f3-4541-bf43-2a81e232bd80","Type":"ContainerStarted","Data":"eec45078a3709eee0c51f3281e77c73a94f34b004e1f5706c205d151a7623355"}
Apr 16 18:56:19.727408 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.727383 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894"
Apr 16 18:56:19.751349 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.751316 2567 generic.go:358] "Generic (PLEG): container finished" podID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerID="5a3a9fe053da9bb3e9b17577d114df177c1317b20dc0495b317a7421db26cce2" exitCode=0
Apr 16 18:56:19.751508 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.751389 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894"
Apr 16 18:56:19.751508 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.751396 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" event={"ID":"7a4f52fe-c0b0-4d74-8296-83991486e38e","Type":"ContainerDied","Data":"5a3a9fe053da9bb3e9b17577d114df177c1317b20dc0495b317a7421db26cce2"}
Apr 16 18:56:19.751508 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.751436 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894" event={"ID":"7a4f52fe-c0b0-4d74-8296-83991486e38e","Type":"ContainerDied","Data":"5b5175fc142b0be0ccddb1978c2d6649da26079d530b43fe41aa170b7356c58d"}
Apr 16 18:56:19.751508 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.751451 2567 scope.go:117] "RemoveContainer" containerID="5a3a9fe053da9bb3e9b17577d114df177c1317b20dc0495b317a7421db26cce2"
Apr 16 18:56:19.752924 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.752904 2567 generic.go:358] "Generic (PLEG): container finished" podID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerID="a4c8e7672886d14356f067b5263d66e607e93216a634883390195c9fe069f9de" exitCode=0
Apr 16 18:56:19.753012 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.752944 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" event={"ID":"690e0b75-d4f3-4541-bf43-2a81e232bd80","Type":"ContainerDied","Data":"a4c8e7672886d14356f067b5263d66e607e93216a634883390195c9fe069f9de"}
Apr 16 18:56:19.759761 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.759738 2567 scope.go:117] "RemoveContainer" containerID="a4f92a8b189b00fe4eed2ee0ed05648720ce967c80ff61b74a3efa9a9298c73a"
Apr 16 18:56:19.767977 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.767953 2567 scope.go:117] "RemoveContainer" containerID="5a3a9fe053da9bb3e9b17577d114df177c1317b20dc0495b317a7421db26cce2"
Apr 16 18:56:19.768310 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:56:19.768287 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a3a9fe053da9bb3e9b17577d114df177c1317b20dc0495b317a7421db26cce2\": container with ID starting with 5a3a9fe053da9bb3e9b17577d114df177c1317b20dc0495b317a7421db26cce2 not found: ID does not exist" containerID="5a3a9fe053da9bb3e9b17577d114df177c1317b20dc0495b317a7421db26cce2"
Apr 16 18:56:19.768415 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.768320 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a3a9fe053da9bb3e9b17577d114df177c1317b20dc0495b317a7421db26cce2"} err="failed to get container status \"5a3a9fe053da9bb3e9b17577d114df177c1317b20dc0495b317a7421db26cce2\": rpc error: code = NotFound desc = could not find container \"5a3a9fe053da9bb3e9b17577d114df177c1317b20dc0495b317a7421db26cce2\": container with ID starting with 5a3a9fe053da9bb3e9b17577d114df177c1317b20dc0495b317a7421db26cce2 not found: ID does not exist"
Apr 16 18:56:19.768415 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.768339 2567 scope.go:117] "RemoveContainer" containerID="a4f92a8b189b00fe4eed2ee0ed05648720ce967c80ff61b74a3efa9a9298c73a"
Apr 16 18:56:19.768656 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:56:19.768635 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f92a8b189b00fe4eed2ee0ed05648720ce967c80ff61b74a3efa9a9298c73a\": container with ID starting with a4f92a8b189b00fe4eed2ee0ed05648720ce967c80ff61b74a3efa9a9298c73a not found: ID does not exist" containerID="a4f92a8b189b00fe4eed2ee0ed05648720ce967c80ff61b74a3efa9a9298c73a"
Apr 16 18:56:19.768731 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.768664 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f92a8b189b00fe4eed2ee0ed05648720ce967c80ff61b74a3efa9a9298c73a"} err="failed to get container status \"a4f92a8b189b00fe4eed2ee0ed05648720ce967c80ff61b74a3efa9a9298c73a\": rpc error: code = NotFound desc = could not find container \"a4f92a8b189b00fe4eed2ee0ed05648720ce967c80ff61b74a3efa9a9298c73a\": container with ID starting with a4f92a8b189b00fe4eed2ee0ed05648720ce967c80ff61b74a3efa9a9298c73a not found: ID does not exist"
Apr 16 18:56:19.794923 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.794893 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a4f52fe-c0b0-4d74-8296-83991486e38e-kserve-provision-location\") pod \"7a4f52fe-c0b0-4d74-8296-83991486e38e\" (UID: \"7a4f52fe-c0b0-4d74-8296-83991486e38e\") "
Apr 16 18:56:19.795234 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.795209 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a4f52fe-c0b0-4d74-8296-83991486e38e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7a4f52fe-c0b0-4d74-8296-83991486e38e" (UID: "7a4f52fe-c0b0-4d74-8296-83991486e38e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:56:19.895674 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:19.895634 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a4f52fe-c0b0-4d74-8296-83991486e38e-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:56:20.073789 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:20.073749 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894"]
Apr 16 18:56:20.077655 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:20.077621 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7c6977494c-9f894"]
Apr 16 18:56:20.760103 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:20.760066 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" event={"ID":"690e0b75-d4f3-4541-bf43-2a81e232bd80","Type":"ContainerStarted","Data":"095662219e9532512251f505fc535b9ea93455b6c682e49e8586547fc2f02555"}
Apr 16 18:56:20.760502 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:20.760363 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"
Apr 16 18:56:20.761812 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:20.761787 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused"
Apr 16 18:56:20.777011 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:20.776963 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" podStartSLOduration=6.776946905 podStartE2EDuration="6.776946905s" podCreationTimestamp="2026-04-16 18:56:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:56:20.775956553 +0000 UTC m=+2742.159620026" watchObservedRunningTime="2026-04-16 18:56:20.776946905 +0000 UTC m=+2742.160610374"
Apr 16 18:56:21.242116 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:21.242083 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" path="/var/lib/kubelet/pods/7a4f52fe-c0b0-4d74-8296-83991486e38e/volumes"
Apr 16 18:56:21.764191 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:21.764154 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused"
Apr 16 18:56:31.764217 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:31.764169 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused"
Apr 16 18:56:41.764239 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:41.764197 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused"
Apr 16 18:56:51.764883 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:56:51.764839 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused"
Apr 16 18:57:01.765101 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:01.765006 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused"
Apr 16 18:57:11.764586 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:11.764543 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused"
Apr 16 18:57:21.764292 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:21.764248 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused"
Apr 16 18:57:31.766212 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:31.766178 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"
Apr 16 18:57:35.086819 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.086776 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"]
Apr 16 18:57:35.087245 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.087087 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerName="kserve-container" containerID="cri-o://095662219e9532512251f505fc535b9ea93455b6c682e49e8586547fc2f02555" gracePeriod=30
Apr 16 18:57:35.146597 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.146561 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"]
Apr 16 18:57:35.146991 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.146965 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerName="storage-initializer"
Apr 16 18:57:35.146991 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.146987 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerName="storage-initializer"
Apr 16 18:57:35.146991 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.146997 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerName="kserve-container"
Apr 16 18:57:35.147198 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.147003 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerName="kserve-container"
Apr 16 18:57:35.147198 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.147082 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a4f52fe-c0b0-4d74-8296-83991486e38e" containerName="kserve-container"
Apr 16 18:57:35.150081 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.150062 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"
Apr 16 18:57:35.158093 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.157854 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"]
Apr 16 18:57:35.237311 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.237269 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a237d19-448c-490b-8cb1-3ff182a7c046-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-ttww7\" (UID: \"4a237d19-448c-490b-8cb1-3ff182a7c046\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"
Apr 16 18:57:35.338515 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.338412 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a237d19-448c-490b-8cb1-3ff182a7c046-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-ttww7\" (UID: \"4a237d19-448c-490b-8cb1-3ff182a7c046\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"
Apr 16 18:57:35.338874 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.338851 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a237d19-448c-490b-8cb1-3ff182a7c046-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-ttww7\" (UID: \"4a237d19-448c-490b-8cb1-3ff182a7c046\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"
Apr 16 18:57:35.462246 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.462210 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"
Apr 16 18:57:35.590206 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:35.590106 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"]
Apr 16 18:57:35.592406 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:57:35.592377 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a237d19_448c_490b_8cb1_3ff182a7c046.slice/crio-bc3488c6763c0bcc9e31fa6dadee0cb3b9fa684d8eb06faa353bc0099f51a5ef WatchSource:0}: Error finding container bc3488c6763c0bcc9e31fa6dadee0cb3b9fa684d8eb06faa353bc0099f51a5ef: Status 404 returned error can't find the container with id bc3488c6763c0bcc9e31fa6dadee0cb3b9fa684d8eb06faa353bc0099f51a5ef
Apr 16 18:57:36.013686 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:36.013651 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7" event={"ID":"4a237d19-448c-490b-8cb1-3ff182a7c046","Type":"ContainerStarted","Data":"d43369c6efeb74478e1b19b522fb9352c28a6e906d45895a9f8ba96ceee72bcf"}
Apr 16 18:57:36.013686 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:36.013686 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7" event={"ID":"4a237d19-448c-490b-8cb1-3ff182a7c046","Type":"ContainerStarted","Data":"bc3488c6763c0bcc9e31fa6dadee0cb3b9fa684d8eb06faa353bc0099f51a5ef"}
Apr 16 18:57:39.541317 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:39.541293 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"
Apr 16 18:57:39.675282 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:39.675186 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/690e0b75-d4f3-4541-bf43-2a81e232bd80-kserve-provision-location\") pod \"690e0b75-d4f3-4541-bf43-2a81e232bd80\" (UID: \"690e0b75-d4f3-4541-bf43-2a81e232bd80\") "
Apr 16 18:57:39.675542 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:39.675516 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/690e0b75-d4f3-4541-bf43-2a81e232bd80-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "690e0b75-d4f3-4541-bf43-2a81e232bd80" (UID: "690e0b75-d4f3-4541-bf43-2a81e232bd80"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:57:39.776367 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:39.776335 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/690e0b75-d4f3-4541-bf43-2a81e232bd80-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:57:40.028919 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:40.028880 2567 generic.go:358] "Generic (PLEG): container finished" podID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerID="095662219e9532512251f505fc535b9ea93455b6c682e49e8586547fc2f02555" exitCode=0
Apr 16 18:57:40.029107 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:40.028952 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"
Apr 16 18:57:40.029107 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:40.028968 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" event={"ID":"690e0b75-d4f3-4541-bf43-2a81e232bd80","Type":"ContainerDied","Data":"095662219e9532512251f505fc535b9ea93455b6c682e49e8586547fc2f02555"}
Apr 16 18:57:40.029107 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:40.029007 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth" event={"ID":"690e0b75-d4f3-4541-bf43-2a81e232bd80","Type":"ContainerDied","Data":"eec45078a3709eee0c51f3281e77c73a94f34b004e1f5706c205d151a7623355"}
Apr 16 18:57:40.029107 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:40.029023 2567 scope.go:117] "RemoveContainer" containerID="095662219e9532512251f505fc535b9ea93455b6c682e49e8586547fc2f02555"
Apr 16 18:57:40.038744 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:40.038725 2567 scope.go:117] "RemoveContainer" containerID="a4c8e7672886d14356f067b5263d66e607e93216a634883390195c9fe069f9de"
Apr 16 18:57:40.046610 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:40.046594 2567 scope.go:117] "RemoveContainer" containerID="095662219e9532512251f505fc535b9ea93455b6c682e49e8586547fc2f02555"
Apr 16 18:57:40.046865 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:57:40.046842 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"095662219e9532512251f505fc535b9ea93455b6c682e49e8586547fc2f02555\": container with ID starting with 095662219e9532512251f505fc535b9ea93455b6c682e49e8586547fc2f02555 not found: ID does not exist" containerID="095662219e9532512251f505fc535b9ea93455b6c682e49e8586547fc2f02555"
Apr 16 18:57:40.046912 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:40.046893 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"095662219e9532512251f505fc535b9ea93455b6c682e49e8586547fc2f02555"} err="failed to get container status \"095662219e9532512251f505fc535b9ea93455b6c682e49e8586547fc2f02555\": rpc error: code = NotFound desc = could not find container \"095662219e9532512251f505fc535b9ea93455b6c682e49e8586547fc2f02555\": container with ID starting with 095662219e9532512251f505fc535b9ea93455b6c682e49e8586547fc2f02555 not found: ID does not exist"
Apr 16 18:57:40.046912 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:40.046910 2567 scope.go:117] "RemoveContainer" containerID="a4c8e7672886d14356f067b5263d66e607e93216a634883390195c9fe069f9de"
Apr 16 18:57:40.047158 ip-10-0-141-192 kubenswrapper[2567]: E0416 18:57:40.047138 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4c8e7672886d14356f067b5263d66e607e93216a634883390195c9fe069f9de\": container with ID starting with a4c8e7672886d14356f067b5263d66e607e93216a634883390195c9fe069f9de not found: ID does not exist" containerID="a4c8e7672886d14356f067b5263d66e607e93216a634883390195c9fe069f9de"
Apr 16 18:57:40.047228 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:40.047168 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4c8e7672886d14356f067b5263d66e607e93216a634883390195c9fe069f9de"} err="failed to get container status \"a4c8e7672886d14356f067b5263d66e607e93216a634883390195c9fe069f9de\": rpc error: code = NotFound desc = could not find container \"a4c8e7672886d14356f067b5263d66e607e93216a634883390195c9fe069f9de\": container with ID starting with a4c8e7672886d14356f067b5263d66e607e93216a634883390195c9fe069f9de not found: ID does not exist"
Apr 16 18:57:40.051005 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:40.050983 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"]
Apr 16 18:57:40.057270 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:40.057252 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6d47cd9cb4-wshth"]
Apr 16 18:57:41.034231 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:41.034199 2567 generic.go:358] "Generic (PLEG): container finished" podID="4a237d19-448c-490b-8cb1-3ff182a7c046" containerID="d43369c6efeb74478e1b19b522fb9352c28a6e906d45895a9f8ba96ceee72bcf" exitCode=0
Apr 16 18:57:41.034607 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:41.034272 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7" event={"ID":"4a237d19-448c-490b-8cb1-3ff182a7c046","Type":"ContainerDied","Data":"d43369c6efeb74478e1b19b522fb9352c28a6e906d45895a9f8ba96ceee72bcf"}
Apr 16 18:57:41.241316 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:41.241284 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" path="/var/lib/kubelet/pods/690e0b75-d4f3-4541-bf43-2a81e232bd80/volumes"
Apr 16 18:57:45.053759 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:45.053724 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7" event={"ID":"4a237d19-448c-490b-8cb1-3ff182a7c046","Type":"ContainerStarted","Data":"6cf165c5c5346282a3c4d4ea867e7ece242c433c12b74ae92090a26f119169e7"}
Apr 16 18:57:45.054182 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:45.054019 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"
Apr 16 18:57:45.055353 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:45.055325 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7" podUID="4a237d19-448c-490b-8cb1-3ff182a7c046" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused"
Apr 16 18:57:45.072300 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:45.072250 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7" podStartSLOduration=6.468828215 podStartE2EDuration="10.072235188s" podCreationTimestamp="2026-04-16 18:57:35 +0000 UTC" firstStartedPulling="2026-04-16 18:57:41.035401487 +0000 UTC m=+2822.419064940" lastFinishedPulling="2026-04-16 18:57:44.638808464 +0000 UTC m=+2826.022471913" observedRunningTime="2026-04-16 18:57:45.072215582 +0000 UTC m=+2826.455879053" watchObservedRunningTime="2026-04-16 18:57:45.072235188 +0000 UTC m=+2826.455898660"
Apr 16 18:57:46.057162 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:46.057120 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7" podUID="4a237d19-448c-490b-8cb1-3ff182a7c046" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused"
Apr 16 18:57:56.057481 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:57:56.057430 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7" podUID="4a237d19-448c-490b-8cb1-3ff182a7c046" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused"
Apr 16 18:58:06.058876 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:06.058845 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"
Apr 16 18:58:26.891213 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:26.891121 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"]
Apr 16 18:58:26.891881 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:26.891591 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7" podUID="4a237d19-448c-490b-8cb1-3ff182a7c046" containerName="kserve-container" containerID="cri-o://6cf165c5c5346282a3c4d4ea867e7ece242c433c12b74ae92090a26f119169e7" gracePeriod=30
Apr 16 18:58:26.963681 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:26.963638 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"]
Apr 16 18:58:26.964153 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:26.964125 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerName="kserve-container"
Apr 16 18:58:26.964153 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:26.964146 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerName="kserve-container"
Apr 16 18:58:26.964329 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:26.964172 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerName="storage-initializer"
Apr 16 18:58:26.964329 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:26.964181 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerName="storage-initializer"
Apr 16 18:58:26.964329 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:26.964261 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="690e0b75-d4f3-4541-bf43-2a81e232bd80" containerName="kserve-container"
Apr 16 18:58:26.967540 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:26.967517 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"
Apr 16 18:58:26.980933 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:26.980899 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"]
Apr 16 18:58:27.079901 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:27.079858 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-98768\" (UID: \"660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"
Apr 16 18:58:27.180739 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:27.180639 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-98768\" (UID: \"660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"
Apr 16 18:58:27.181054 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:27.181018 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-98768\" (UID: \"660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"
Apr 16 18:58:27.281799 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:27.281758 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"
Apr 16 18:58:27.410441 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:27.410412 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"]
Apr 16 18:58:27.413230 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:58:27.413197 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod660fb6c4_f9f1_43ce_ad32_6059ae0e9ab8.slice/crio-d7d70773bd2035c3194266963035c13d8fc94deb20be73b074cf9abca0041aed WatchSource:0}: Error finding container d7d70773bd2035c3194266963035c13d8fc94deb20be73b074cf9abca0041aed: Status 404 returned error can't find the container with id d7d70773bd2035c3194266963035c13d8fc94deb20be73b074cf9abca0041aed
Apr 16 18:58:28.200382 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:28.200345 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768" event={"ID":"660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8","Type":"ContainerStarted","Data":"13be432d7f4b00942ae083b45b4841d4d13962593ed06ad13f674b27494eb0cf"}
Apr 16 18:58:28.200382 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:28.200385 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768" event={"ID":"660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8","Type":"ContainerStarted","Data":"d7d70773bd2035c3194266963035c13d8fc94deb20be73b074cf9abca0041aed"}
Apr 16 18:58:32.214201 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:32.214167 2567 generic.go:358] "Generic (PLEG): container finished" podID="660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8" containerID="13be432d7f4b00942ae083b45b4841d4d13962593ed06ad13f674b27494eb0cf" exitCode=0
Apr 16 18:58:32.214582 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:32.214235 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768" event={"ID":"660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8","Type":"ContainerDied","Data":"13be432d7f4b00942ae083b45b4841d4d13962593ed06ad13f674b27494eb0cf"}
Apr 16 18:58:33.219281 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:33.219246 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768" event={"ID":"660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8","Type":"ContainerStarted","Data":"f43068547136feb3960e2f2d4fc8f97ed24cd7bbc153f2faf79d11bb0601df11"}
Apr 16 18:58:33.219645 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:33.219527 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"
Apr 16 18:58:33.220941 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:33.220912 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768" podUID="660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused"
Apr 16 18:58:33.235909 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:33.235839 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768" podStartSLOduration=7.235818986 podStartE2EDuration="7.235818986s" podCreationTimestamp="2026-04-16 18:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:58:33.234873905 +0000 UTC m=+2874.618537377" watchObservedRunningTime="2026-04-16 18:58:33.235818986 +0000 UTC m=+2874.619482458"
Apr 16 18:58:34.222447 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:34.222407 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768" podUID="660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused"
Apr 16 18:58:44.223450 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:44.223422 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"
Apr 16 18:58:57.304348 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:57.304317 2567 generic.go:358] "Generic (PLEG): container finished" podID="4a237d19-448c-490b-8cb1-3ff182a7c046" containerID="6cf165c5c5346282a3c4d4ea867e7ece242c433c12b74ae92090a26f119169e7" exitCode=137
Apr 16 18:58:57.304740 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:57.304405 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7" event={"ID":"4a237d19-448c-490b-8cb1-3ff182a7c046","Type":"ContainerDied","Data":"6cf165c5c5346282a3c4d4ea867e7ece242c433c12b74ae92090a26f119169e7"}
Apr 16 18:58:57.529323 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:57.529300 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"
Apr 16 18:58:57.646005 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:57.645918 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a237d19-448c-490b-8cb1-3ff182a7c046-kserve-provision-location\") pod \"4a237d19-448c-490b-8cb1-3ff182a7c046\" (UID: \"4a237d19-448c-490b-8cb1-3ff182a7c046\") "
Apr 16 18:58:57.656640 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:57.656615 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a237d19-448c-490b-8cb1-3ff182a7c046-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4a237d19-448c-490b-8cb1-3ff182a7c046" (UID: "4a237d19-448c-490b-8cb1-3ff182a7c046"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:58:57.747052 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:57.747001 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a237d19-448c-490b-8cb1-3ff182a7c046-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:58:58.286446 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.286405 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"]
Apr 16 18:58:58.287246 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.287208 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768" podUID="660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8" containerName="kserve-container" containerID="cri-o://f43068547136feb3960e2f2d4fc8f97ed24cd7bbc153f2faf79d11bb0601df11" gracePeriod=30
Apr 16 18:58:58.310155 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.310072 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7" event={"ID":"4a237d19-448c-490b-8cb1-3ff182a7c046","Type":"ContainerDied","Data":"bc3488c6763c0bcc9e31fa6dadee0cb3b9fa684d8eb06faa353bc0099f51a5ef"}
Apr 16 18:58:58.310155 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.310138 2567 scope.go:117] "RemoveContainer" containerID="6cf165c5c5346282a3c4d4ea867e7ece242c433c12b74ae92090a26f119169e7"
Apr 16 18:58:58.310696 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.310495 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"
Apr 16 18:58:58.320809 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.320775 2567 scope.go:117] "RemoveContainer" containerID="d43369c6efeb74478e1b19b522fb9352c28a6e906d45895a9f8ba96ceee72bcf"
Apr 16 18:58:58.338745 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.338722 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj"]
Apr 16 18:58:58.339105 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.339083 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a237d19-448c-490b-8cb1-3ff182a7c046" containerName="storage-initializer"
Apr 16 18:58:58.339105 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.339102 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a237d19-448c-490b-8cb1-3ff182a7c046" containerName="storage-initializer"
Apr 16 18:58:58.339285 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.339130 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a237d19-448c-490b-8cb1-3ff182a7c046" containerName="kserve-container"
Apr 16 18:58:58.339285 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.339137 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a237d19-448c-490b-8cb1-3ff182a7c046" containerName="kserve-container"
Apr 16 18:58:58.339285 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.339204 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a237d19-448c-490b-8cb1-3ff182a7c046" containerName="kserve-container"
Apr 16 18:58:58.343987 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.343921 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj"
Apr 16 18:58:58.344876 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.344852 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"]
Apr 16 18:58:58.347218 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.347196 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-ttww7"]
Apr 16 18:58:58.352848 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.352825 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj"]
Apr 16 18:58:58.452585 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.452549 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-k96gj\" (UID: \"bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj"
Apr 16 18:58:58.553377 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.553291 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-k96gj\" (UID: \"bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj"
Apr 16 18:58:58.553676 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.553657 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-k96gj\" (UID: \"bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj"
Apr 16 18:58:58.655095 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.655015 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj"
Apr 16 18:58:58.778921 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:58.778888 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj"]
Apr 16 18:58:58.782307 ip-10-0-141-192 kubenswrapper[2567]: W0416 18:58:58.782278 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdaa0b5e_f1c9_43a2_bbd5_5e8fbf1d2a45.slice/crio-d27a36f9811b22a5bc55e9a9ab0b2d1e278d0dcd8c0c6a4a2c6a1f107cdb2c09 WatchSource:0}: Error finding container d27a36f9811b22a5bc55e9a9ab0b2d1e278d0dcd8c0c6a4a2c6a1f107cdb2c09: Status 404 returned error can't find the container with id d27a36f9811b22a5bc55e9a9ab0b2d1e278d0dcd8c0c6a4a2c6a1f107cdb2c09
Apr 16 18:58:59.248497 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:59.248464 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a237d19-448c-490b-8cb1-3ff182a7c046" path="/var/lib/kubelet/pods/4a237d19-448c-490b-8cb1-3ff182a7c046/volumes"
Apr 16 18:58:59.315809 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:59.315773 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj" event={"ID":"bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45","Type":"ContainerStarted","Data":"0c8dc485dd565b407ac411447c017a73afb48710327fe7a8dac01fd976dcfb2a"}
Apr 16 18:58:59.315809 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:58:59.315811 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj" event={"ID":"bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45","Type":"ContainerStarted","Data":"d27a36f9811b22a5bc55e9a9ab0b2d1e278d0dcd8c0c6a4a2c6a1f107cdb2c09"}
Apr 16 18:59:03.329526 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:03.329493 2567 generic.go:358] "Generic (PLEG): container finished" podID="bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45" containerID="0c8dc485dd565b407ac411447c017a73afb48710327fe7a8dac01fd976dcfb2a" exitCode=0
Apr 16 18:59:03.329903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:03.329544 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj" event={"ID":"bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45","Type":"ContainerDied","Data":"0c8dc485dd565b407ac411447c017a73afb48710327fe7a8dac01fd976dcfb2a"}
Apr 16 18:59:28.451177 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:28.451087 2567 generic.go:358] "Generic (PLEG): container finished" podID="660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8" containerID="f43068547136feb3960e2f2d4fc8f97ed24cd7bbc153f2faf79d11bb0601df11" exitCode=137
Apr 16 18:59:28.451177 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:28.451156 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768" event={"ID":"660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8","Type":"ContainerDied","Data":"f43068547136feb3960e2f2d4fc8f97ed24cd7bbc153f2faf79d11bb0601df11"}
Apr 16 18:59:29.040722 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:29.040696 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"
Apr 16 18:59:29.161827 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:29.161738 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8-kserve-provision-location\") pod \"660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8\" (UID: \"660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8\") "
Apr 16 18:59:29.165775 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:29.165741 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8" (UID: "660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:59:29.262732 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:29.262695 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\""
Apr 16 18:59:29.457653 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:29.457568 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768" event={"ID":"660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8","Type":"ContainerDied","Data":"d7d70773bd2035c3194266963035c13d8fc94deb20be73b074cf9abca0041aed"}
Apr 16 18:59:29.457653 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:29.457596 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"
Apr 16 18:59:29.457653 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:29.457618 2567 scope.go:117] "RemoveContainer" containerID="f43068547136feb3960e2f2d4fc8f97ed24cd7bbc153f2faf79d11bb0601df11"
Apr 16 18:59:29.469197 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:29.469161 2567 scope.go:117] "RemoveContainer" containerID="13be432d7f4b00942ae083b45b4841d4d13962593ed06ad13f674b27494eb0cf"
Apr 16 18:59:29.475406 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:29.475384 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"]
Apr 16 18:59:29.478903 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:29.478882 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-98768"]
Apr 16 18:59:31.243262 ip-10-0-141-192 kubenswrapper[2567]: I0416 18:59:31.243008 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8" path="/var/lib/kubelet/pods/660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8/volumes"
Apr 16 19:00:58.800413 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:00:58.800374 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj" event={"ID":"bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45","Type":"ContainerStarted","Data":"896eb214d7855b6c85da05b015f43cf07bb51efc9394075ccba5034abbbe5d9d"}
Apr 16 19:00:58.800855 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:00:58.800601 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj"
Apr 16 19:00:58.802059 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:00:58.802017 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj" podUID="bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused"
Apr 16 19:00:58.819901 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:00:58.819847 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj" podStartSLOduration=6.173624176 podStartE2EDuration="2m0.819831182s" podCreationTimestamp="2026-04-16 18:58:58 +0000 UTC" firstStartedPulling="2026-04-16 18:59:03.330541521 +0000 UTC m=+2904.714204970" lastFinishedPulling="2026-04-16 19:00:57.976748304 +0000 UTC m=+3019.360411976" observedRunningTime="2026-04-16 19:00:58.81789058 +0000 UTC m=+3020.201554067" watchObservedRunningTime="2026-04-16 19:00:58.819831182 +0000 UTC m=+3020.203494653"
Apr 16 19:00:59.804216 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:00:59.804179 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj" podUID="bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused"
Apr 16 19:01:09.805690 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:09.805661 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj"
Apr 16 19:01:20.023129 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.023091 2567 kubelet.go:2553] "SyncLoop DELETE" source="api"
pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj"] Apr 16 19:01:20.023571 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.023454 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj" podUID="bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45" containerName="kserve-container" containerID="cri-o://896eb214d7855b6c85da05b015f43cf07bb51efc9394075ccba5034abbbe5d9d" gracePeriod=30 Apr 16 19:01:20.085113 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.085061 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr"] Apr 16 19:01:20.085448 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.085434 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8" containerName="kserve-container" Apr 16 19:01:20.085513 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.085449 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8" containerName="kserve-container" Apr 16 19:01:20.085513 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.085469 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8" containerName="storage-initializer" Apr 16 19:01:20.085513 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.085476 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8" containerName="storage-initializer" Apr 16 19:01:20.085629 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.085544 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="660fb6c4-f9f1-43ce-ad32-6059ae0e9ab8" containerName="kserve-container" Apr 16 19:01:20.088822 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.088760 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" Apr 16 19:01:20.097928 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.097845 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr"] Apr 16 19:01:20.171753 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.171709 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8253825e-1e4a-4b93-bfc6-9d3f48b38c75-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-7rkhr\" (UID: \"8253825e-1e4a-4b93-bfc6-9d3f48b38c75\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" Apr 16 19:01:20.272909 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.272868 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8253825e-1e4a-4b93-bfc6-9d3f48b38c75-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-7rkhr\" (UID: \"8253825e-1e4a-4b93-bfc6-9d3f48b38c75\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" Apr 16 19:01:20.273382 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.273361 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8253825e-1e4a-4b93-bfc6-9d3f48b38c75-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-7rkhr\" (UID: \"8253825e-1e4a-4b93-bfc6-9d3f48b38c75\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" Apr 16 19:01:20.400951 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.400852 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" Apr 16 19:01:20.527104 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.527075 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr"] Apr 16 19:01:20.529981 ip-10-0-141-192 kubenswrapper[2567]: W0416 19:01:20.529938 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8253825e_1e4a_4b93_bfc6_9d3f48b38c75.slice/crio-2f3c1a468de3718a3d285b764c2cb11f4ad606ad8fe09b6832dffbb70894034e WatchSource:0}: Error finding container 2f3c1a468de3718a3d285b764c2cb11f4ad606ad8fe09b6832dffbb70894034e: Status 404 returned error can't find the container with id 2f3c1a468de3718a3d285b764c2cb11f4ad606ad8fe09b6832dffbb70894034e Apr 16 19:01:20.532401 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.532377 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:01:20.875022 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.874988 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" event={"ID":"8253825e-1e4a-4b93-bfc6-9d3f48b38c75","Type":"ContainerStarted","Data":"be839c1e1de28f714d2f5b8ddd077483332a9481cf1034620259fc19ad4f3870"} Apr 16 19:01:20.875022 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:20.875028 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" event={"ID":"8253825e-1e4a-4b93-bfc6-9d3f48b38c75","Type":"ContainerStarted","Data":"2f3c1a468de3718a3d285b764c2cb11f4ad606ad8fe09b6832dffbb70894034e"} Apr 16 19:01:22.884450 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:22.884416 2567 generic.go:358] "Generic (PLEG): container finished" podID="bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45" containerID="896eb214d7855b6c85da05b015f43cf07bb51efc9394075ccba5034abbbe5d9d" exitCode=0 Apr 16 19:01:22.884859 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:22.884491 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj" event={"ID":"bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45","Type":"ContainerDied","Data":"896eb214d7855b6c85da05b015f43cf07bb51efc9394075ccba5034abbbe5d9d"} Apr 16 19:01:22.982626 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:22.982602 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj" Apr 16 19:01:22.991029 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:22.991004 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45-kserve-provision-location\") pod \"bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45\" (UID: \"bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45\") " Apr 16 19:01:22.991410 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:22.991388 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45" (UID: "bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:01:23.092167 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:23.092070 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:01:23.888979 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:23.888944 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj" event={"ID":"bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45","Type":"ContainerDied","Data":"d27a36f9811b22a5bc55e9a9ab0b2d1e278d0dcd8c0c6a4a2c6a1f107cdb2c09"} Apr 16 19:01:23.889424 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:23.888996 2567 scope.go:117] "RemoveContainer" containerID="896eb214d7855b6c85da05b015f43cf07bb51efc9394075ccba5034abbbe5d9d" Apr 16 19:01:23.889424 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:23.889010 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj" Apr 16 19:01:23.897148 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:23.897120 2567 scope.go:117] "RemoveContainer" containerID="0c8dc485dd565b407ac411447c017a73afb48710327fe7a8dac01fd976dcfb2a" Apr 16 19:01:23.913027 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:23.912997 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj"] Apr 16 19:01:23.915735 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:23.915707 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-k96gj"] Apr 16 19:01:24.893775 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:24.893743 2567 generic.go:358] "Generic (PLEG): container finished" podID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerID="be839c1e1de28f714d2f5b8ddd077483332a9481cf1034620259fc19ad4f3870" exitCode=0 Apr 16 19:01:24.894274 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:24.893818 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" event={"ID":"8253825e-1e4a-4b93-bfc6-9d3f48b38c75","Type":"ContainerDied","Data":"be839c1e1de28f714d2f5b8ddd077483332a9481cf1034620259fc19ad4f3870"} Apr 16 19:01:25.242492 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:25.242453 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45" path="/var/lib/kubelet/pods/bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45/volumes" Apr 16 19:01:43.967812 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:43.967776 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" event={"ID":"8253825e-1e4a-4b93-bfc6-9d3f48b38c75","Type":"ContainerStarted","Data":"c39bb2ef004dc98572696029f34a04bbe2d51b70bf44c7f209139310e3c8f078"} Apr 16 19:01:43.968253 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:43.968075 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" Apr 16 19:01:43.969434 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:43.969412 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" podUID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.67:8080: connect: connection refused" Apr 16 19:01:43.983470 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:43.983414 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" podStartSLOduration=5.21265704 podStartE2EDuration="23.983396003s" podCreationTimestamp="2026-04-16 19:01:20 +0000 UTC" firstStartedPulling="2026-04-16 19:01:24.895283367 +0000 UTC m=+3046.278946820" lastFinishedPulling="2026-04-16 19:01:43.66602233 +0000 UTC m=+3065.049685783" observedRunningTime="2026-04-16 19:01:43.982367596 +0000 UTC m=+3065.366031068" watchObservedRunningTime="2026-04-16 19:01:43.983396003 +0000 UTC m=+3065.367059475" Apr 16 19:01:44.971273 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:44.971229 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" podUID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused" Apr 16 19:01:54.972128 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:01:54.972083 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" podUID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused" Apr 16 19:02:04.971274 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:04.971227 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" podUID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused" Apr 16 19:02:14.972233 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:14.972188 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" podUID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused" Apr 16 19:02:24.972145 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:24.972105 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" podUID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused" Apr 16 19:02:34.972264 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:34.972210 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" podUID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.67:8080: connect: connection refused" Apr 16 19:02:44.972811 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:44.972779 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" Apr 16 19:02:50.210123 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:50.210035 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr"] Apr 16 19:02:50.210618 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:50.210503 2567 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" podUID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerName="kserve-container" containerID="cri-o://c39bb2ef004dc98572696029f34a04bbe2d51b70bf44c7f209139310e3c8f078" gracePeriod=30 Apr 16 19:02:50.281946 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:50.281903 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg"] Apr 16 19:02:50.282293 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:50.282276 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45" containerName="storage-initializer" Apr 16 19:02:50.282293 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:50.282294 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45" containerName="storage-initializer" Apr 16 19:02:50.282416 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:50.282313 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45" containerName="kserve-container" Apr 16 19:02:50.282416 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:50.282319 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45" containerName="kserve-container" Apr 16 19:02:50.282416 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:50.282400 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdaa0b5e-f1c9-43a2-bbd5-5e8fbf1d2a45" containerName="kserve-container" Apr 16 19:02:50.285419 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:50.285401 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" Apr 16 19:02:50.294978 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:50.294949 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg"] Apr 16 19:02:50.442551 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:50.442496 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb6e9802-79d7-48ad-a2db-a9a676dd8af5-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg\" (UID: \"fb6e9802-79d7-48ad-a2db-a9a676dd8af5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" Apr 16 19:02:50.543653 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:50.543540 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb6e9802-79d7-48ad-a2db-a9a676dd8af5-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg\" (UID: \"fb6e9802-79d7-48ad-a2db-a9a676dd8af5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" Apr 16 19:02:50.543952 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:50.543926 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb6e9802-79d7-48ad-a2db-a9a676dd8af5-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg\" (UID: \"fb6e9802-79d7-48ad-a2db-a9a676dd8af5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" Apr 16 19:02:50.598397 ip-10-0-141-192 
kubenswrapper[2567]: I0416 19:02:50.598345 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" Apr 16 19:02:50.718451 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:50.718426 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg"] Apr 16 19:02:50.720744 ip-10-0-141-192 kubenswrapper[2567]: W0416 19:02:50.720716 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb6e9802_79d7_48ad_a2db_a9a676dd8af5.slice/crio-6cacad4d25ae7cb65a2187ebf1ed96cf4abe537312063b4ad81f607a7c81926f WatchSource:0}: Error finding container 6cacad4d25ae7cb65a2187ebf1ed96cf4abe537312063b4ad81f607a7c81926f: Status 404 returned error can't find the container with id 6cacad4d25ae7cb65a2187ebf1ed96cf4abe537312063b4ad81f607a7c81926f Apr 16 19:02:51.191007 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:51.190966 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" event={"ID":"fb6e9802-79d7-48ad-a2db-a9a676dd8af5","Type":"ContainerStarted","Data":"ff23154c2b0b656b526aa6e9c328cfc8573fde53b31eb66a75ca4639d98c1a36"} Apr 16 19:02:51.191007 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:51.191011 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" event={"ID":"fb6e9802-79d7-48ad-a2db-a9a676dd8af5","Type":"ContainerStarted","Data":"6cacad4d25ae7cb65a2187ebf1ed96cf4abe537312063b4ad81f607a7c81926f"} Apr 16 19:02:53.961553 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:53.961530 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" Apr 16 19:02:54.074938 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.074846 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8253825e-1e4a-4b93-bfc6-9d3f48b38c75-kserve-provision-location\") pod \"8253825e-1e4a-4b93-bfc6-9d3f48b38c75\" (UID: \"8253825e-1e4a-4b93-bfc6-9d3f48b38c75\") " Apr 16 19:02:54.075196 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.075174 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8253825e-1e4a-4b93-bfc6-9d3f48b38c75-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8253825e-1e4a-4b93-bfc6-9d3f48b38c75" (UID: "8253825e-1e4a-4b93-bfc6-9d3f48b38c75"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:02:54.176241 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.176196 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8253825e-1e4a-4b93-bfc6-9d3f48b38c75-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:02:54.203272 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.203241 2567 generic.go:358] "Generic (PLEG): container finished" podID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerID="c39bb2ef004dc98572696029f34a04bbe2d51b70bf44c7f209139310e3c8f078" exitCode=0 Apr 16 19:02:54.203433 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.203284 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" event={"ID":"8253825e-1e4a-4b93-bfc6-9d3f48b38c75","Type":"ContainerDied","Data":"c39bb2ef004dc98572696029f34a04bbe2d51b70bf44c7f209139310e3c8f078"} Apr 16 19:02:54.203433 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.203306 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" Apr 16 19:02:54.203433 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.203320 2567 scope.go:117] "RemoveContainer" containerID="c39bb2ef004dc98572696029f34a04bbe2d51b70bf44c7f209139310e3c8f078" Apr 16 19:02:54.203433 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.203310 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr" event={"ID":"8253825e-1e4a-4b93-bfc6-9d3f48b38c75","Type":"ContainerDied","Data":"2f3c1a468de3718a3d285b764c2cb11f4ad606ad8fe09b6832dffbb70894034e"} Apr 16 19:02:54.211520 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.211503 2567 scope.go:117] "RemoveContainer" containerID="be839c1e1de28f714d2f5b8ddd077483332a9481cf1034620259fc19ad4f3870" Apr 16 19:02:54.218737 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.218719 2567 scope.go:117] "RemoveContainer" containerID="c39bb2ef004dc98572696029f34a04bbe2d51b70bf44c7f209139310e3c8f078" Apr 16 19:02:54.218987 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:02:54.218970 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c39bb2ef004dc98572696029f34a04bbe2d51b70bf44c7f209139310e3c8f078\": container with ID starting with c39bb2ef004dc98572696029f34a04bbe2d51b70bf44c7f209139310e3c8f078 not found: ID does not exist" containerID="c39bb2ef004dc98572696029f34a04bbe2d51b70bf44c7f209139310e3c8f078" Apr 16 19:02:54.219027 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.218998 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c39bb2ef004dc98572696029f34a04bbe2d51b70bf44c7f209139310e3c8f078"} err="failed to get container status \"c39bb2ef004dc98572696029f34a04bbe2d51b70bf44c7f209139310e3c8f078\": rpc error: code = NotFound desc = could not find container \"c39bb2ef004dc98572696029f34a04bbe2d51b70bf44c7f209139310e3c8f078\": container with ID starting with c39bb2ef004dc98572696029f34a04bbe2d51b70bf44c7f209139310e3c8f078 not found: ID does not exist" Apr 16 19:02:54.219027 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.219017 2567 scope.go:117] "RemoveContainer" containerID="be839c1e1de28f714d2f5b8ddd077483332a9481cf1034620259fc19ad4f3870" Apr 16 19:02:54.219318 ip-10-0-141-192 kubenswrapper[2567]: E0416 
19:02:54.219302 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be839c1e1de28f714d2f5b8ddd077483332a9481cf1034620259fc19ad4f3870\": container with ID starting with be839c1e1de28f714d2f5b8ddd077483332a9481cf1034620259fc19ad4f3870 not found: ID does not exist" containerID="be839c1e1de28f714d2f5b8ddd077483332a9481cf1034620259fc19ad4f3870" Apr 16 19:02:54.219372 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.219324 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be839c1e1de28f714d2f5b8ddd077483332a9481cf1034620259fc19ad4f3870"} err="failed to get container status \"be839c1e1de28f714d2f5b8ddd077483332a9481cf1034620259fc19ad4f3870\": rpc error: code = NotFound desc = could not find container \"be839c1e1de28f714d2f5b8ddd077483332a9481cf1034620259fc19ad4f3870\": container with ID starting with be839c1e1de28f714d2f5b8ddd077483332a9481cf1034620259fc19ad4f3870 not found: ID does not exist" Apr 16 19:02:54.225074 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.225030 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr"] Apr 16 19:02:54.227987 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:54.227966 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7rkhr"] Apr 16 19:02:55.207866 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:55.207834 2567 generic.go:358] "Generic (PLEG): container finished" podID="fb6e9802-79d7-48ad-a2db-a9a676dd8af5" containerID="ff23154c2b0b656b526aa6e9c328cfc8573fde53b31eb66a75ca4639d98c1a36" exitCode=0 Apr 16 19:02:55.208318 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:55.207903 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" event={"ID":"fb6e9802-79d7-48ad-a2db-a9a676dd8af5","Type":"ContainerDied","Data":"ff23154c2b0b656b526aa6e9c328cfc8573fde53b31eb66a75ca4639d98c1a36"} Apr 16 19:02:55.241667 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:55.241640 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" path="/var/lib/kubelet/pods/8253825e-1e4a-4b93-bfc6-9d3f48b38c75/volumes" Apr 16 19:02:56.213792 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:56.213758 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" event={"ID":"fb6e9802-79d7-48ad-a2db-a9a676dd8af5","Type":"ContainerStarted","Data":"dd8fc6072455e5d93f168d4574e3b54d42c5e8fbb8fcfb6d5cd88ba5887e668a"} Apr 16 19:02:56.214218 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:56.213980 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" Apr 16 19:02:56.250904 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:02:56.250855 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" podStartSLOduration=6.250840181 podStartE2EDuration="6.250840181s" podCreationTimestamp="2026-04-16 19:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:02:56.250105759 +0000 UTC m=+3137.633769233" watchObservedRunningTime="2026-04-16 19:02:56.250840181 +0000 UTC 
m=+3137.634503651" Apr 16 19:03:27.241637 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:27.241601 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" Apr 16 19:03:30.507484 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:30.507447 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg"] Apr 16 19:03:30.508464 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:30.508359 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" podUID="fb6e9802-79d7-48ad-a2db-a9a676dd8af5" containerName="kserve-container" containerID="cri-o://dd8fc6072455e5d93f168d4574e3b54d42c5e8fbb8fcfb6d5cd88ba5887e668a" gracePeriod=30 Apr 16 19:03:30.553693 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:30.553664 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm"] Apr 16 19:03:30.554092 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:30.554076 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerName="kserve-container" Apr 16 19:03:30.554191 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:30.554094 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerName="kserve-container" Apr 16 19:03:30.554191 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:30.554110 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerName="storage-initializer" Apr 16 19:03:30.554191 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:30.554118 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerName="storage-initializer" Apr 16 19:03:30.554356 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:30.554203 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="8253825e-1e4a-4b93-bfc6-9d3f48b38c75" containerName="kserve-container" Apr 16 19:03:30.557714 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:30.557688 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" Apr 16 19:03:30.564928 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:30.564897 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm"] Apr 16 19:03:30.586989 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:30.586953 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2af3a43-ec36-4b67-96a6-777101b83e66-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-p9wnm\" (UID: \"c2af3a43-ec36-4b67-96a6-777101b83e66\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" Apr 16 19:03:30.688282 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:30.688249 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2af3a43-ec36-4b67-96a6-777101b83e66-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-p9wnm\" (UID: \"c2af3a43-ec36-4b67-96a6-777101b83e66\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" Apr 16 19:03:30.688636 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:30.688614 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2af3a43-ec36-4b67-96a6-777101b83e66-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-p9wnm\" (UID: \"c2af3a43-ec36-4b67-96a6-777101b83e66\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" Apr 16 19:03:30.871115 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:30.871014 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" Apr 16 19:03:31.000854 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:31.000827 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm"] Apr 16 19:03:31.003092 ip-10-0-141-192 kubenswrapper[2567]: W0416 19:03:31.003059 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2af3a43_ec36_4b67_96a6_777101b83e66.slice/crio-004bbb8c175a076d9afb2e51e130c8d3f65316f83db99abc162de717de751940 WatchSource:0}: Error finding container 004bbb8c175a076d9afb2e51e130c8d3f65316f83db99abc162de717de751940: Status 404 returned error can't find the container with id 004bbb8c175a076d9afb2e51e130c8d3f65316f83db99abc162de717de751940 Apr 16 19:03:31.335942 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:31.335896 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" event={"ID":"c2af3a43-ec36-4b67-96a6-777101b83e66","Type":"ContainerStarted","Data":"7c98274f64970bba837c7b3ae08f2e77af651fd77addb6d11fd438d634ca8a19"} Apr 16 19:03:31.335942 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:31.335949 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" event={"ID":"c2af3a43-ec36-4b67-96a6-777101b83e66","Type":"ContainerStarted","Data":"004bbb8c175a076d9afb2e51e130c8d3f65316f83db99abc162de717de751940"} Apr 16 19:03:35.351192 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:35.351152 2567 generic.go:358] "Generic (PLEG): container finished" podID="c2af3a43-ec36-4b67-96a6-777101b83e66" containerID="7c98274f64970bba837c7b3ae08f2e77af651fd77addb6d11fd438d634ca8a19" exitCode=0 Apr 16 19:03:35.351192 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:35.351193 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" event={"ID":"c2af3a43-ec36-4b67-96a6-777101b83e66","Type":"ContainerDied","Data":"7c98274f64970bba837c7b3ae08f2e77af651fd77addb6d11fd438d634ca8a19"} Apr 16 19:03:36.356061 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:36.356009 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" event={"ID":"c2af3a43-ec36-4b67-96a6-777101b83e66","Type":"ContainerStarted","Data":"9be6a0bb62da9de9e229009660dd4e6beda68b52fd614a83f04f6b8d41bd0e95"} Apr 16 19:03:36.356472 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:36.356252 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" Apr 16 19:03:36.375099 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:36.375034 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" podStartSLOduration=6.375018478 podStartE2EDuration="6.375018478s" podCreationTimestamp="2026-04-16 19:03:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:03:36.372429575 +0000 UTC m=+3177.756093046" watchObservedRunningTime="2026-04-16 19:03:36.375018478 +0000 UTC m=+3177.758681949" Apr 16 19:03:37.218343 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:37.218302 2567 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" podUID="fb6e9802-79d7-48ad-a2db-a9a676dd8af5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.68:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.133.0.68:8080: connect: connection refused" Apr 16 19:03:39.368787 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:39.368752 2567 generic.go:358] "Generic (PLEG): container finished" podID="fb6e9802-79d7-48ad-a2db-a9a676dd8af5" containerID="dd8fc6072455e5d93f168d4574e3b54d42c5e8fbb8fcfb6d5cd88ba5887e668a" exitCode=0 Apr 16 19:03:39.369294 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:39.368829 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" event={"ID":"fb6e9802-79d7-48ad-a2db-a9a676dd8af5","Type":"ContainerDied","Data":"dd8fc6072455e5d93f168d4574e3b54d42c5e8fbb8fcfb6d5cd88ba5887e668a"} Apr 16 19:03:39.457110 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:39.457086 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" Apr 16 19:03:39.564274 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:39.564168 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb6e9802-79d7-48ad-a2db-a9a676dd8af5-kserve-provision-location\") pod \"fb6e9802-79d7-48ad-a2db-a9a676dd8af5\" (UID: \"fb6e9802-79d7-48ad-a2db-a9a676dd8af5\") " Apr 16 19:03:39.564562 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:39.564541 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb6e9802-79d7-48ad-a2db-a9a676dd8af5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fb6e9802-79d7-48ad-a2db-a9a676dd8af5" (UID: "fb6e9802-79d7-48ad-a2db-a9a676dd8af5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:03:39.665258 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:39.665211 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb6e9802-79d7-48ad-a2db-a9a676dd8af5-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:03:40.373548 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:40.373505 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" event={"ID":"fb6e9802-79d7-48ad-a2db-a9a676dd8af5","Type":"ContainerDied","Data":"6cacad4d25ae7cb65a2187ebf1ed96cf4abe537312063b4ad81f607a7c81926f"} Apr 16 19:03:40.373548 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:40.373554 2567 scope.go:117] "RemoveContainer" containerID="dd8fc6072455e5d93f168d4574e3b54d42c5e8fbb8fcfb6d5cd88ba5887e668a" Apr 16 19:03:40.374013 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:40.373519 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg" Apr 16 19:03:40.382248 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:40.382228 2567 scope.go:117] "RemoveContainer" containerID="ff23154c2b0b656b526aa6e9c328cfc8573fde53b31eb66a75ca4639d98c1a36" Apr 16 19:03:40.395718 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:40.395689 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg"] Apr 16 19:03:40.398744 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:40.398716 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rpmjg"] Apr 16 19:03:41.241551 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:03:41.241515 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb6e9802-79d7-48ad-a2db-a9a676dd8af5" path="/var/lib/kubelet/pods/fb6e9802-79d7-48ad-a2db-a9a676dd8af5/volumes" Apr 16 19:04:07.360708 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:07.360667 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" podUID="c2af3a43-ec36-4b67-96a6-777101b83e66" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.69:8080/v2/models/xgboost-v2-mlserver/ready\": dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 19:04:17.362925 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:17.362893 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" Apr 16 19:04:20.656869 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:20.656831 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm"] Apr 16 19:04:20.657422 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:20.657148 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" podUID="c2af3a43-ec36-4b67-96a6-777101b83e66" containerName="kserve-container" containerID="cri-o://9be6a0bb62da9de9e229009660dd4e6beda68b52fd614a83f04f6b8d41bd0e95" gracePeriod=30 Apr 16 19:04:20.706100 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:20.706034 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x"] Apr 16 19:04:20.706413 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:20.706401 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb6e9802-79d7-48ad-a2db-a9a676dd8af5" containerName="kserve-container" Apr 16 19:04:20.706458 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:20.706415 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6e9802-79d7-48ad-a2db-a9a676dd8af5" containerName="kserve-container" Apr 16 19:04:20.706458 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:20.706430 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb6e9802-79d7-48ad-a2db-a9a676dd8af5" containerName="storage-initializer" Apr 16 19:04:20.706458 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:20.706436 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6e9802-79d7-48ad-a2db-a9a676dd8af5" containerName="storage-initializer" Apr 16 19:04:20.706558 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:20.706492 2567 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="fb6e9802-79d7-48ad-a2db-a9a676dd8af5" containerName="kserve-container" Apr 16 19:04:20.709630 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:20.709607 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" Apr 16 19:04:20.717603 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:20.717167 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x"] Apr 16 19:04:20.812419 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:20.812374 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abdc2bec-6af6-46a8-af3a-ca2851ac9b50-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-mbj4x\" (UID: \"abdc2bec-6af6-46a8-af3a-ca2851ac9b50\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" Apr 16 19:04:20.914024 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:20.913921 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abdc2bec-6af6-46a8-af3a-ca2851ac9b50-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-mbj4x\" (UID: \"abdc2bec-6af6-46a8-af3a-ca2851ac9b50\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" Apr 16 19:04:20.914329 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:20.914309 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abdc2bec-6af6-46a8-af3a-ca2851ac9b50-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-mbj4x\" (UID: \"abdc2bec-6af6-46a8-af3a-ca2851ac9b50\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" Apr 16 19:04:21.022105 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:21.022071 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" Apr 16 19:04:21.138420 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:21.138385 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x"] Apr 16 19:04:21.143408 ip-10-0-141-192 kubenswrapper[2567]: W0416 19:04:21.143372 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabdc2bec_6af6_46a8_af3a_ca2851ac9b50.slice/crio-4c48cb67aba236af6e059558a456814d9384057e3cb9b14549630954a74f9abb WatchSource:0}: Error finding container 4c48cb67aba236af6e059558a456814d9384057e3cb9b14549630954a74f9abb: Status 404 returned error can't find the container with id 4c48cb67aba236af6e059558a456814d9384057e3cb9b14549630954a74f9abb Apr 16 19:04:21.514912 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:21.514873 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" event={"ID":"abdc2bec-6af6-46a8-af3a-ca2851ac9b50","Type":"ContainerStarted","Data":"3433640f311d16990c638546324187a54f1f98d216d9f5d29bb108105bc9bdd0"} Apr 16 19:04:21.515107 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:21.514921 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" event={"ID":"abdc2bec-6af6-46a8-af3a-ca2851ac9b50","Type":"ContainerStarted","Data":"4c48cb67aba236af6e059558a456814d9384057e3cb9b14549630954a74f9abb"} Apr 16 19:04:25.530391 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:25.530307 2567 generic.go:358] "Generic (PLEG): container finished" podID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerID="3433640f311d16990c638546324187a54f1f98d216d9f5d29bb108105bc9bdd0" exitCode=0 Apr 16 19:04:25.530755 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:25.530381 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" event={"ID":"abdc2bec-6af6-46a8-af3a-ca2851ac9b50","Type":"ContainerDied","Data":"3433640f311d16990c638546324187a54f1f98d216d9f5d29bb108105bc9bdd0"} Apr 16 19:04:26.535680 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:26.535649 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" event={"ID":"abdc2bec-6af6-46a8-af3a-ca2851ac9b50","Type":"ContainerStarted","Data":"2634d24204bd3329abd53d81e8f796579ae1ccbef26a5c7ba7fd9aadb0142853"} Apr 16 19:04:26.536117 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:26.535901 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" Apr 16 19:04:26.537292 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:26.537263 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" podUID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 19:04:26.552290 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:26.552244 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" podStartSLOduration=6.552229452 podStartE2EDuration="6.552229452s" podCreationTimestamp="2026-04-16 19:04:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:04:26.550470542 +0000 UTC m=+3227.934134012" watchObservedRunningTime="2026-04-16 19:04:26.552229452 +0000 UTC m=+3227.935892923" Apr 16 19:04:27.101368 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.101346 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" Apr 16 19:04:27.169401 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.169368 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2af3a43-ec36-4b67-96a6-777101b83e66-kserve-provision-location\") pod \"c2af3a43-ec36-4b67-96a6-777101b83e66\" (UID: \"c2af3a43-ec36-4b67-96a6-777101b83e66\") " Apr 16 19:04:27.169699 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.169677 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2af3a43-ec36-4b67-96a6-777101b83e66-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c2af3a43-ec36-4b67-96a6-777101b83e66" (UID: "c2af3a43-ec36-4b67-96a6-777101b83e66"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:04:27.270720 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.270684 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2af3a43-ec36-4b67-96a6-777101b83e66-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:04:27.541462 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.541376 2567 generic.go:358] "Generic (PLEG): container finished" podID="c2af3a43-ec36-4b67-96a6-777101b83e66" containerID="9be6a0bb62da9de9e229009660dd4e6beda68b52fd614a83f04f6b8d41bd0e95" exitCode=0 Apr 16 19:04:27.541462 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.541441 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" Apr 16 19:04:27.541953 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.541456 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" event={"ID":"c2af3a43-ec36-4b67-96a6-777101b83e66","Type":"ContainerDied","Data":"9be6a0bb62da9de9e229009660dd4e6beda68b52fd614a83f04f6b8d41bd0e95"} Apr 16 19:04:27.541953 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.541503 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm" event={"ID":"c2af3a43-ec36-4b67-96a6-777101b83e66","Type":"ContainerDied","Data":"004bbb8c175a076d9afb2e51e130c8d3f65316f83db99abc162de717de751940"} Apr 16 19:04:27.541953 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.541525 2567 scope.go:117] "RemoveContainer" containerID="9be6a0bb62da9de9e229009660dd4e6beda68b52fd614a83f04f6b8d41bd0e95" Apr 16 19:04:27.541953 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.541933 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" podUID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 19:04:27.549935 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.549917 2567 scope.go:117] "RemoveContainer" containerID="7c98274f64970bba837c7b3ae08f2e77af651fd77addb6d11fd438d634ca8a19" Apr 16 19:04:27.557707 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.557686 2567 scope.go:117] "RemoveContainer" containerID="9be6a0bb62da9de9e229009660dd4e6beda68b52fd614a83f04f6b8d41bd0e95" Apr 16 19:04:27.557982 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:04:27.557962 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9be6a0bb62da9de9e229009660dd4e6beda68b52fd614a83f04f6b8d41bd0e95\": container with ID starting with 9be6a0bb62da9de9e229009660dd4e6beda68b52fd614a83f04f6b8d41bd0e95 not found: ID does not exist" containerID="9be6a0bb62da9de9e229009660dd4e6beda68b52fd614a83f04f6b8d41bd0e95" Apr 16 19:04:27.558068 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.557994 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be6a0bb62da9de9e229009660dd4e6beda68b52fd614a83f04f6b8d41bd0e95"} err="failed to get container status \"9be6a0bb62da9de9e229009660dd4e6beda68b52fd614a83f04f6b8d41bd0e95\": rpc error: code = NotFound desc = could not find container \"9be6a0bb62da9de9e229009660dd4e6beda68b52fd614a83f04f6b8d41bd0e95\": container with ID starting with 9be6a0bb62da9de9e229009660dd4e6beda68b52fd614a83f04f6b8d41bd0e95 not found: ID does not exist" Apr 16 19:04:27.558068 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.558012 2567 scope.go:117] "RemoveContainer" containerID="7c98274f64970bba837c7b3ae08f2e77af651fd77addb6d11fd438d634ca8a19" Apr 16 19:04:27.558194 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.558167 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm"] Apr 16 19:04:27.558289 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:04:27.558272 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c98274f64970bba837c7b3ae08f2e77af651fd77addb6d11fd438d634ca8a19\": 
container with ID starting with 7c98274f64970bba837c7b3ae08f2e77af651fd77addb6d11fd438d634ca8a19 not found: ID does not exist" containerID="7c98274f64970bba837c7b3ae08f2e77af651fd77addb6d11fd438d634ca8a19" Apr 16 19:04:27.558336 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.558293 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c98274f64970bba837c7b3ae08f2e77af651fd77addb6d11fd438d634ca8a19"} err="failed to get container status \"7c98274f64970bba837c7b3ae08f2e77af651fd77addb6d11fd438d634ca8a19\": rpc error: code = NotFound desc = could not find container \"7c98274f64970bba837c7b3ae08f2e77af651fd77addb6d11fd438d634ca8a19\": container with ID starting with 7c98274f64970bba837c7b3ae08f2e77af651fd77addb6d11fd438d634ca8a19 not found: ID does not exist" Apr 16 19:04:27.561636 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:27.561611 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p9wnm"] Apr 16 19:04:29.242720 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:29.242686 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2af3a43-ec36-4b67-96a6-777101b83e66" path="/var/lib/kubelet/pods/c2af3a43-ec36-4b67-96a6-777101b83e66/volumes" Apr 16 19:04:37.542402 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:37.542362 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" podUID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 19:04:47.542454 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:47.542404 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" podUID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 19:04:57.541937 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:04:57.541891 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" podUID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 19:05:07.542723 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:07.542677 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" podUID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 19:05:17.542653 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:17.542606 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" podUID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 19:05:27.543227 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:27.543191 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" Apr 16 19:05:30.856419 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:30.856384 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x"] Apr 16 19:05:30.856797 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:30.856649 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" podUID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerName="kserve-container" containerID="cri-o://2634d24204bd3329abd53d81e8f796579ae1ccbef26a5c7ba7fd9aadb0142853" gracePeriod=30 Apr 16 19:05:30.904398 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:30.904357 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9"] Apr 16 19:05:30.904829 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:30.904800 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2af3a43-ec36-4b67-96a6-777101b83e66" containerName="storage-initializer" Apr 16 19:05:30.904829 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:30.904826 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2af3a43-ec36-4b67-96a6-777101b83e66" containerName="storage-initializer" Apr 16 19:05:30.905057 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:30.904876 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2af3a43-ec36-4b67-96a6-777101b83e66" containerName="kserve-container" Apr 16 19:05:30.905057 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:30.904890 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2af3a43-ec36-4b67-96a6-777101b83e66" containerName="kserve-container" Apr 16 19:05:30.905057 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:30.904975 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2af3a43-ec36-4b67-96a6-777101b83e66" containerName="kserve-container" Apr 16 19:05:30.909316 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:30.909296 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" Apr 16 19:05:30.917208 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:30.917186 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9"] Apr 16 19:05:31.009881 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:31.009835 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97c7fb9a-5ec3-49f4-ba30-de895847d8fb-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9\" (UID: \"97c7fb9a-5ec3-49f4-ba30-de895847d8fb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" Apr 16 19:05:31.110860 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:31.110761 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97c7fb9a-5ec3-49f4-ba30-de895847d8fb-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9\" (UID: \"97c7fb9a-5ec3-49f4-ba30-de895847d8fb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" Apr 16 19:05:31.111212 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:31.111192 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97c7fb9a-5ec3-49f4-ba30-de895847d8fb-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9\" (UID: \"97c7fb9a-5ec3-49f4-ba30-de895847d8fb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" Apr 16 19:05:31.221845 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:31.221805 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" Apr 16 19:05:31.342356 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:31.342309 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9"] Apr 16 19:05:31.345796 ip-10-0-141-192 kubenswrapper[2567]: W0416 19:05:31.345758 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97c7fb9a_5ec3_49f4_ba30_de895847d8fb.slice/crio-0700a2d2c02bd498c25594a6486114001455488532480a3c5a467e53006f5507 WatchSource:0}: Error finding container 0700a2d2c02bd498c25594a6486114001455488532480a3c5a467e53006f5507: Status 404 returned error can't find the container with id 0700a2d2c02bd498c25594a6486114001455488532480a3c5a467e53006f5507 Apr 16 19:05:31.765944 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:31.765912 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" event={"ID":"97c7fb9a-5ec3-49f4-ba30-de895847d8fb","Type":"ContainerStarted","Data":"d1254cd3b3f76e7b5f62803c3a192cc1e9986820c7711f37a95f0b67a4b7ab9d"} Apr 16 19:05:31.765944 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:31.765950 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" event={"ID":"97c7fb9a-5ec3-49f4-ba30-de895847d8fb","Type":"ContainerStarted","Data":"0700a2d2c02bd498c25594a6486114001455488532480a3c5a467e53006f5507"} Apr 16 19:05:34.506337 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.506312 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" Apr 16 19:05:34.640597 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.640511 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abdc2bec-6af6-46a8-af3a-ca2851ac9b50-kserve-provision-location\") pod \"abdc2bec-6af6-46a8-af3a-ca2851ac9b50\" (UID: \"abdc2bec-6af6-46a8-af3a-ca2851ac9b50\") " Apr 16 19:05:34.640821 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.640798 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abdc2bec-6af6-46a8-af3a-ca2851ac9b50-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "abdc2bec-6af6-46a8-af3a-ca2851ac9b50" (UID: "abdc2bec-6af6-46a8-af3a-ca2851ac9b50"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:05:34.741785 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.741753 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abdc2bec-6af6-46a8-af3a-ca2851ac9b50-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:05:34.777518 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.777477 2567 generic.go:358] "Generic (PLEG): container finished" podID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerID="2634d24204bd3329abd53d81e8f796579ae1ccbef26a5c7ba7fd9aadb0142853" exitCode=0 Apr 16 19:05:34.777640 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.777523 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" event={"ID":"abdc2bec-6af6-46a8-af3a-ca2851ac9b50","Type":"ContainerDied","Data":"2634d24204bd3329abd53d81e8f796579ae1ccbef26a5c7ba7fd9aadb0142853"} Apr 16 19:05:34.777640 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.777553 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" event={"ID":"abdc2bec-6af6-46a8-af3a-ca2851ac9b50","Type":"ContainerDied","Data":"4c48cb67aba236af6e059558a456814d9384057e3cb9b14549630954a74f9abb"} Apr 16 19:05:34.777640 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.777552 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x" Apr 16 19:05:34.777640 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.777565 2567 scope.go:117] "RemoveContainer" containerID="2634d24204bd3329abd53d81e8f796579ae1ccbef26a5c7ba7fd9aadb0142853" Apr 16 19:05:34.785704 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.785665 2567 scope.go:117] "RemoveContainer" containerID="3433640f311d16990c638546324187a54f1f98d216d9f5d29bb108105bc9bdd0" Apr 16 19:05:34.792977 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.792962 2567 scope.go:117] "RemoveContainer" containerID="2634d24204bd3329abd53d81e8f796579ae1ccbef26a5c7ba7fd9aadb0142853" Apr 16 19:05:34.793307 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:05:34.793287 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2634d24204bd3329abd53d81e8f796579ae1ccbef26a5c7ba7fd9aadb0142853\": container with ID starting with 2634d24204bd3329abd53d81e8f796579ae1ccbef26a5c7ba7fd9aadb0142853 not found: ID does not exist" containerID="2634d24204bd3329abd53d81e8f796579ae1ccbef26a5c7ba7fd9aadb0142853" Apr 16 19:05:34.793362 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.793318 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2634d24204bd3329abd53d81e8f796579ae1ccbef26a5c7ba7fd9aadb0142853"} err="failed to get container status \"2634d24204bd3329abd53d81e8f796579ae1ccbef26a5c7ba7fd9aadb0142853\": rpc error: code = NotFound desc = could not find container \"2634d24204bd3329abd53d81e8f796579ae1ccbef26a5c7ba7fd9aadb0142853\": container with ID starting with 2634d24204bd3329abd53d81e8f796579ae1ccbef26a5c7ba7fd9aadb0142853 not found: ID does not exist" Apr 16 19:05:34.793362 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.793335 2567 scope.go:117] "RemoveContainer" containerID="3433640f311d16990c638546324187a54f1f98d216d9f5d29bb108105bc9bdd0" Apr 16 19:05:34.793561 ip-10-0-141-192 
kubenswrapper[2567]: E0416 19:05:34.793542 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3433640f311d16990c638546324187a54f1f98d216d9f5d29bb108105bc9bdd0\": container with ID starting with 3433640f311d16990c638546324187a54f1f98d216d9f5d29bb108105bc9bdd0 not found: ID does not exist" containerID="3433640f311d16990c638546324187a54f1f98d216d9f5d29bb108105bc9bdd0" Apr 16 19:05:34.793602 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.793567 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3433640f311d16990c638546324187a54f1f98d216d9f5d29bb108105bc9bdd0"} err="failed to get container status \"3433640f311d16990c638546324187a54f1f98d216d9f5d29bb108105bc9bdd0\": rpc error: code = NotFound desc = could not find container \"3433640f311d16990c638546324187a54f1f98d216d9f5d29bb108105bc9bdd0\": container with ID starting with 3433640f311d16990c638546324187a54f1f98d216d9f5d29bb108105bc9bdd0 not found: ID does not exist" Apr 16 19:05:34.799366 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.799346 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x"] Apr 16 19:05:34.807760 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:34.802875 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-mbj4x"] Apr 16 19:05:35.242314 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:35.242280 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" path="/var/lib/kubelet/pods/abdc2bec-6af6-46a8-af3a-ca2851ac9b50/volumes" Apr 16 19:05:35.782022 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:35.781989 2567 generic.go:358] "Generic (PLEG): container finished" podID="97c7fb9a-5ec3-49f4-ba30-de895847d8fb" containerID="d1254cd3b3f76e7b5f62803c3a192cc1e9986820c7711f37a95f0b67a4b7ab9d" exitCode=0 Apr 16 19:05:35.782448 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:35.782066 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" event={"ID":"97c7fb9a-5ec3-49f4-ba30-de895847d8fb","Type":"ContainerDied","Data":"d1254cd3b3f76e7b5f62803c3a192cc1e9986820c7711f37a95f0b67a4b7ab9d"} Apr 16 19:05:36.788188 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:36.788155 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" event={"ID":"97c7fb9a-5ec3-49f4-ba30-de895847d8fb","Type":"ContainerStarted","Data":"de222d539e51872e80d03c96fb72e3d1e90c11bdc857ff0e508b7be245297b08"} Apr 16 19:05:36.788564 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:36.788362 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" Apr 16 19:05:36.805656 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:05:36.805612 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" podStartSLOduration=6.805599109 podStartE2EDuration="6.805599109s" podCreationTimestamp="2026-04-16 19:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:05:36.803282882 +0000 UTC m=+3298.186946378" watchObservedRunningTime="2026-04-16 
19:05:36.805599109 +0000 UTC m=+3298.189262580" Apr 16 19:06:07.838209 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:07.838166 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" podUID="97c7fb9a-5ec3-49f4-ba30-de895847d8fb" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 19:06:17.837604 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:17.837544 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" podUID="97c7fb9a-5ec3-49f4-ba30-de895847d8fb" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 19:06:27.794911 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:27.794878 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" Apr 16 19:06:31.017487 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.017441 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9"] Apr 16 19:06:31.017950 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.017786 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" podUID="97c7fb9a-5ec3-49f4-ba30-de895847d8fb" containerName="kserve-container" containerID="cri-o://de222d539e51872e80d03c96fb72e3d1e90c11bdc857ff0e508b7be245297b08" gracePeriod=30 Apr 16 19:06:31.077333 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.077300 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz"] Apr 16 19:06:31.077673 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.077659 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerName="storage-initializer" Apr 16 19:06:31.077732 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.077675 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerName="storage-initializer" Apr 16 19:06:31.077732 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.077701 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerName="kserve-container" Apr 16 19:06:31.077732 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.077707 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerName="kserve-container" Apr 16 19:06:31.077832 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.077766 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="abdc2bec-6af6-46a8-af3a-ca2851ac9b50" containerName="kserve-container" Apr 16 19:06:31.081517 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.081497 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" Apr 16 19:06:31.088288 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.088258 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz"] Apr 16 19:06:31.214954 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.214918 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/896a69c0-bfce-43a8-8155-9e6b08db3c25-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-drwvz\" (UID: \"896a69c0-bfce-43a8-8155-9e6b08db3c25\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" Apr 16 19:06:31.315905 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.315820 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/896a69c0-bfce-43a8-8155-9e6b08db3c25-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-drwvz\" (UID: \"896a69c0-bfce-43a8-8155-9e6b08db3c25\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" Apr 16 19:06:31.316199 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.316181 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/896a69c0-bfce-43a8-8155-9e6b08db3c25-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-drwvz\" (UID: \"896a69c0-bfce-43a8-8155-9e6b08db3c25\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" Apr 16 19:06:31.393251 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.393218 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" Apr 16 19:06:31.518718 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.518695 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz"] Apr 16 19:06:31.521375 ip-10-0-141-192 kubenswrapper[2567]: W0416 19:06:31.521338 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod896a69c0_bfce_43a8_8155_9e6b08db3c25.slice/crio-d63fd699ba3f1e21eb3511e2018fd2f9fc811bc87a476304255c30965d0a21f9 WatchSource:0}: Error finding container d63fd699ba3f1e21eb3511e2018fd2f9fc811bc87a476304255c30965d0a21f9: Status 404 returned error can't find the container with id d63fd699ba3f1e21eb3511e2018fd2f9fc811bc87a476304255c30965d0a21f9 Apr 16 19:06:31.523236 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.523213 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:06:31.971263 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.971223 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" event={"ID":"896a69c0-bfce-43a8-8155-9e6b08db3c25","Type":"ContainerStarted","Data":"c1544d9a5e882cb709ddfe828560aaf01f3a7d5ca68d80f4d45bb112213d59b1"} Apr 16 19:06:31.971263 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:31.971263 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" event={"ID":"896a69c0-bfce-43a8-8155-9e6b08db3c25","Type":"ContainerStarted","Data":"d63fd699ba3f1e21eb3511e2018fd2f9fc811bc87a476304255c30965d0a21f9"} Apr 16 19:06:35.987243 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:35.987208 2567 generic.go:358] "Generic (PLEG): container finished" podID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerID="c1544d9a5e882cb709ddfe828560aaf01f3a7d5ca68d80f4d45bb112213d59b1" exitCode=0 Apr 16 19:06:35.987651 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:35.987259 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" event={"ID":"896a69c0-bfce-43a8-8155-9e6b08db3c25","Type":"ContainerDied","Data":"c1544d9a5e882cb709ddfe828560aaf01f3a7d5ca68d80f4d45bb112213d59b1"} Apr 16 19:06:36.992917 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:36.992880 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" event={"ID":"896a69c0-bfce-43a8-8155-9e6b08db3c25","Type":"ContainerStarted","Data":"f35eb7a6aa0b967bc08e9b0f9414c843ddac4b8299239ab73d6431ec617ddb4c"} Apr 16 19:06:36.993383 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:36.993170 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" Apr 16 19:06:36.994596 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:36.994568 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" podUID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 19:06:37.010992 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:37.010948 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" podStartSLOduration=6.010930328 podStartE2EDuration="6.010930328s" podCreationTimestamp="2026-04-16 19:06:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:06:37.010160091 +0000 UTC m=+3358.393823564" watchObservedRunningTime="2026-04-16 19:06:37.010930328 +0000 UTC m=+3358.394593799" Apr 16 19:06:37.793199 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:37.793158 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" podUID="97c7fb9a-5ec3-49f4-ba30-de895847d8fb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.71:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.133.0.71:8080: connect: connection refused" Apr 16 19:06:37.996029 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:37.995978 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" podUID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 19:06:38.561546 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:38.561521 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" Apr 16 19:06:38.676116 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:38.676016 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97c7fb9a-5ec3-49f4-ba30-de895847d8fb-kserve-provision-location\") pod \"97c7fb9a-5ec3-49f4-ba30-de895847d8fb\" (UID: \"97c7fb9a-5ec3-49f4-ba30-de895847d8fb\") " Apr 16 19:06:38.676347 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:38.676319 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97c7fb9a-5ec3-49f4-ba30-de895847d8fb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "97c7fb9a-5ec3-49f4-ba30-de895847d8fb" (UID: "97c7fb9a-5ec3-49f4-ba30-de895847d8fb"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:06:38.776928 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:38.776890 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97c7fb9a-5ec3-49f4-ba30-de895847d8fb-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:06:39.000289 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:39.000255 2567 generic.go:358] "Generic (PLEG): container finished" podID="97c7fb9a-5ec3-49f4-ba30-de895847d8fb" containerID="de222d539e51872e80d03c96fb72e3d1e90c11bdc857ff0e508b7be245297b08" exitCode=0 Apr 16 19:06:39.000698 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:39.000337 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" event={"ID":"97c7fb9a-5ec3-49f4-ba30-de895847d8fb","Type":"ContainerDied","Data":"de222d539e51872e80d03c96fb72e3d1e90c11bdc857ff0e508b7be245297b08"} Apr 16 19:06:39.000698 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:39.000385 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" event={"ID":"97c7fb9a-5ec3-49f4-ba30-de895847d8fb","Type":"ContainerDied","Data":"0700a2d2c02bd498c25594a6486114001455488532480a3c5a467e53006f5507"} Apr 16 19:06:39.000698 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:39.000400 2567 scope.go:117] "RemoveContainer" containerID="de222d539e51872e80d03c96fb72e3d1e90c11bdc857ff0e508b7be245297b08" Apr 16 19:06:39.000698 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:39.000347 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9" Apr 16 19:06:39.008781 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:39.008761 2567 scope.go:117] "RemoveContainer" containerID="d1254cd3b3f76e7b5f62803c3a192cc1e9986820c7711f37a95f0b67a4b7ab9d" Apr 16 19:06:39.016350 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:39.016324 2567 scope.go:117] "RemoveContainer" containerID="de222d539e51872e80d03c96fb72e3d1e90c11bdc857ff0e508b7be245297b08" Apr 16 19:06:39.016601 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:06:39.016584 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de222d539e51872e80d03c96fb72e3d1e90c11bdc857ff0e508b7be245297b08\": container with ID starting with de222d539e51872e80d03c96fb72e3d1e90c11bdc857ff0e508b7be245297b08 not found: ID does not exist" containerID="de222d539e51872e80d03c96fb72e3d1e90c11bdc857ff0e508b7be245297b08" Apr 16 19:06:39.016663 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:39.016609 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de222d539e51872e80d03c96fb72e3d1e90c11bdc857ff0e508b7be245297b08"} err="failed to get container status \"de222d539e51872e80d03c96fb72e3d1e90c11bdc857ff0e508b7be245297b08\": rpc error: code = NotFound desc = could not find container \"de222d539e51872e80d03c96fb72e3d1e90c11bdc857ff0e508b7be245297b08\": container with ID starting with de222d539e51872e80d03c96fb72e3d1e90c11bdc857ff0e508b7be245297b08 not found: ID does not exist" Apr 16 19:06:39.016663 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:39.016626 2567 scope.go:117] "RemoveContainer" containerID="d1254cd3b3f76e7b5f62803c3a192cc1e9986820c7711f37a95f0b67a4b7ab9d" Apr 16 19:06:39.016875 ip-10-0-141-192 
kubenswrapper[2567]: E0416 19:06:39.016857 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1254cd3b3f76e7b5f62803c3a192cc1e9986820c7711f37a95f0b67a4b7ab9d\": container with ID starting with d1254cd3b3f76e7b5f62803c3a192cc1e9986820c7711f37a95f0b67a4b7ab9d not found: ID does not exist" containerID="d1254cd3b3f76e7b5f62803c3a192cc1e9986820c7711f37a95f0b67a4b7ab9d" Apr 16 19:06:39.016932 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:39.016885 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1254cd3b3f76e7b5f62803c3a192cc1e9986820c7711f37a95f0b67a4b7ab9d"} err="failed to get container status \"d1254cd3b3f76e7b5f62803c3a192cc1e9986820c7711f37a95f0b67a4b7ab9d\": rpc error: code = NotFound desc = could not find container \"d1254cd3b3f76e7b5f62803c3a192cc1e9986820c7711f37a95f0b67a4b7ab9d\": container with ID starting with d1254cd3b3f76e7b5f62803c3a192cc1e9986820c7711f37a95f0b67a4b7ab9d not found: ID does not exist" Apr 16 19:06:39.022542 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:39.022518 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9"] Apr 16 19:06:39.026196 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:39.026172 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kkqd9"] Apr 16 19:06:39.242523 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:39.242482 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97c7fb9a-5ec3-49f4-ba30-de895847d8fb" path="/var/lib/kubelet/pods/97c7fb9a-5ec3-49f4-ba30-de895847d8fb/volumes" Apr 16 19:06:47.996103 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:47.996026 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" podUID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 19:06:57.996414 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:06:57.996366 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" podUID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 19:07:07.996078 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:07.996009 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" podUID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 19:07:17.996645 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:17.996604 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" podUID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 19:07:27.996682 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:27.996573 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" podUID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 19:07:37.997239 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:37.997209 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" Apr 16 19:07:41.219170 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.219139 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz"] Apr 16 19:07:41.219656 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.219439 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" podUID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerName="kserve-container" containerID="cri-o://f35eb7a6aa0b967bc08e9b0f9414c843ddac4b8299239ab73d6431ec617ddb4c" gracePeriod=30 Apr 16 19:07:41.250902 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.250865 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47"] Apr 16 19:07:41.251355 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.251311 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97c7fb9a-5ec3-49f4-ba30-de895847d8fb" containerName="storage-initializer" Apr 16 19:07:41.251355 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.251335 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c7fb9a-5ec3-49f4-ba30-de895847d8fb" containerName="storage-initializer" Apr 16 19:07:41.251355 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.251348 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97c7fb9a-5ec3-49f4-ba30-de895847d8fb" containerName="kserve-container" Apr 16 19:07:41.251355 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.251356 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c7fb9a-5ec3-49f4-ba30-de895847d8fb" containerName="kserve-container" Apr 16 19:07:41.251637 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.251456 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="97c7fb9a-5ec3-49f4-ba30-de895847d8fb" containerName="kserve-container" Apr 16 19:07:41.254495 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.254472 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" Apr 16 19:07:41.257365 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.257344 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 19:07:41.264854 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.264829 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47"] Apr 16 19:07:41.404676 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.404637 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/056399fa-62bf-4fca-87da-40e2911b1d5b-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-85d8c7694d-q6d47\" (UID: \"056399fa-62bf-4fca-87da-40e2911b1d5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" Apr 16 19:07:41.505192 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.505091 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/056399fa-62bf-4fca-87da-40e2911b1d5b-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-85d8c7694d-q6d47\" (UID: \"056399fa-62bf-4fca-87da-40e2911b1d5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" Apr 16 19:07:41.505514 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.505494 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/056399fa-62bf-4fca-87da-40e2911b1d5b-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-85d8c7694d-q6d47\" (UID: \"056399fa-62bf-4fca-87da-40e2911b1d5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" Apr 16 19:07:41.566143 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.566103 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" Apr 16 19:07:41.723366 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:41.723337 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47"] Apr 16 19:07:41.726035 ip-10-0-141-192 kubenswrapper[2567]: W0416 19:07:41.726005 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod056399fa_62bf_4fca_87da_40e2911b1d5b.slice/crio-d26bcd13b83c2a67c04bdc387b155c4b54960c9ca48c01d360ec26468edfeaee WatchSource:0}: Error finding container d26bcd13b83c2a67c04bdc387b155c4b54960c9ca48c01d360ec26468edfeaee: Status 404 returned error can't find the container with id d26bcd13b83c2a67c04bdc387b155c4b54960c9ca48c01d360ec26468edfeaee Apr 16 19:07:42.218166 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:42.218120 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" event={"ID":"056399fa-62bf-4fca-87da-40e2911b1d5b","Type":"ContainerStarted","Data":"7e62fcb0001d9756605cf01ec80878edcbe95dcaac1eef719f643960efc3f412"} Apr 16 19:07:42.218166 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:42.218171 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" event={"ID":"056399fa-62bf-4fca-87da-40e2911b1d5b","Type":"ContainerStarted","Data":"d26bcd13b83c2a67c04bdc387b155c4b54960c9ca48c01d360ec26468edfeaee"} Apr 16 19:07:43.222863 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:43.222832 2567 generic.go:358] "Generic (PLEG): container finished" podID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerID="7e62fcb0001d9756605cf01ec80878edcbe95dcaac1eef719f643960efc3f412" exitCode=0 Apr 16 19:07:43.223311 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:43.222890 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" event={"ID":"056399fa-62bf-4fca-87da-40e2911b1d5b","Type":"ContainerDied","Data":"7e62fcb0001d9756605cf01ec80878edcbe95dcaac1eef719f643960efc3f412"} Apr 16 19:07:44.227715 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:44.227676 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" event={"ID":"056399fa-62bf-4fca-87da-40e2911b1d5b","Type":"ContainerStarted","Data":"29b0be4e0b0443cb2139c73498e385370639f016582acddc82a30668dc3dea15"} Apr 16 19:07:44.228133 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:44.227903 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" Apr 16 19:07:44.229161 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:44.229133 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 19:07:44.245285 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:44.245225 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" podStartSLOduration=3.245211121 podStartE2EDuration="3.245211121s" podCreationTimestamp="2026-04-16 19:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:07:44.243066739 +0000 UTC m=+3425.626730210" watchObservedRunningTime="2026-04-16 19:07:44.245211121 +0000 UTC m=+3425.628874570" Apr 16 19:07:44.966104 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:44.966075 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" Apr 16 19:07:45.134975 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.134877 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/896a69c0-bfce-43a8-8155-9e6b08db3c25-kserve-provision-location\") pod \"896a69c0-bfce-43a8-8155-9e6b08db3c25\" (UID: \"896a69c0-bfce-43a8-8155-9e6b08db3c25\") " Apr 16 19:07:45.135262 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.135237 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/896a69c0-bfce-43a8-8155-9e6b08db3c25-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "896a69c0-bfce-43a8-8155-9e6b08db3c25" (UID: "896a69c0-bfce-43a8-8155-9e6b08db3c25"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:07:45.232230 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.232195 2567 generic.go:358] "Generic (PLEG): container finished" podID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerID="f35eb7a6aa0b967bc08e9b0f9414c843ddac4b8299239ab73d6431ec617ddb4c" exitCode=0 Apr 16 19:07:45.232673 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.232260 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" Apr 16 19:07:45.232673 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.232277 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" event={"ID":"896a69c0-bfce-43a8-8155-9e6b08db3c25","Type":"ContainerDied","Data":"f35eb7a6aa0b967bc08e9b0f9414c843ddac4b8299239ab73d6431ec617ddb4c"} Apr 16 19:07:45.232673 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.232314 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz" event={"ID":"896a69c0-bfce-43a8-8155-9e6b08db3c25","Type":"ContainerDied","Data":"d63fd699ba3f1e21eb3511e2018fd2f9fc811bc87a476304255c30965d0a21f9"} Apr 16 19:07:45.232673 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.232330 2567 scope.go:117] "RemoveContainer" containerID="f35eb7a6aa0b967bc08e9b0f9414c843ddac4b8299239ab73d6431ec617ddb4c" Apr 16 19:07:45.232975 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.232941 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 19:07:45.235610 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.235592 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/896a69c0-bfce-43a8-8155-9e6b08db3c25-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:07:45.241815 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.241788 2567 
scope.go:117] "RemoveContainer" containerID="c1544d9a5e882cb709ddfe828560aaf01f3a7d5ca68d80f4d45bb112213d59b1" Apr 16 19:07:45.249134 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.249108 2567 scope.go:117] "RemoveContainer" containerID="f35eb7a6aa0b967bc08e9b0f9414c843ddac4b8299239ab73d6431ec617ddb4c" Apr 16 19:07:45.249378 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:07:45.249359 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f35eb7a6aa0b967bc08e9b0f9414c843ddac4b8299239ab73d6431ec617ddb4c\": container with ID starting with f35eb7a6aa0b967bc08e9b0f9414c843ddac4b8299239ab73d6431ec617ddb4c not found: ID does not exist" containerID="f35eb7a6aa0b967bc08e9b0f9414c843ddac4b8299239ab73d6431ec617ddb4c" Apr 16 19:07:45.249453 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.249390 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f35eb7a6aa0b967bc08e9b0f9414c843ddac4b8299239ab73d6431ec617ddb4c"} err="failed to get container status \"f35eb7a6aa0b967bc08e9b0f9414c843ddac4b8299239ab73d6431ec617ddb4c\": rpc error: code = NotFound desc = could not find container \"f35eb7a6aa0b967bc08e9b0f9414c843ddac4b8299239ab73d6431ec617ddb4c\": container with ID starting with f35eb7a6aa0b967bc08e9b0f9414c843ddac4b8299239ab73d6431ec617ddb4c not found: ID does not exist" Apr 16 19:07:45.249453 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.249415 2567 scope.go:117] "RemoveContainer" containerID="c1544d9a5e882cb709ddfe828560aaf01f3a7d5ca68d80f4d45bb112213d59b1" Apr 16 19:07:45.249665 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:07:45.249639 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1544d9a5e882cb709ddfe828560aaf01f3a7d5ca68d80f4d45bb112213d59b1\": container with ID starting with c1544d9a5e882cb709ddfe828560aaf01f3a7d5ca68d80f4d45bb112213d59b1 not found: ID does not exist" containerID="c1544d9a5e882cb709ddfe828560aaf01f3a7d5ca68d80f4d45bb112213d59b1" Apr 16 19:07:45.249706 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.249673 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1544d9a5e882cb709ddfe828560aaf01f3a7d5ca68d80f4d45bb112213d59b1"} err="failed to get container status \"c1544d9a5e882cb709ddfe828560aaf01f3a7d5ca68d80f4d45bb112213d59b1\": rpc error: code = NotFound desc = could not find container \"c1544d9a5e882cb709ddfe828560aaf01f3a7d5ca68d80f4d45bb112213d59b1\": container with ID starting with c1544d9a5e882cb709ddfe828560aaf01f3a7d5ca68d80f4d45bb112213d59b1 not found: ID does not exist" Apr 16 19:07:45.254030 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.254007 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz"] Apr 16 19:07:45.256915 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:45.256895 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-drwvz"] Apr 16 19:07:47.241803 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:47.241771 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="896a69c0-bfce-43a8-8155-9e6b08db3c25" path="/var/lib/kubelet/pods/896a69c0-bfce-43a8-8155-9e6b08db3c25/volumes" Apr 16 19:07:55.233738 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:07:55.233690 2567 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 19:08:05.233188 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:08:05.233145 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 19:08:15.233919 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:08:15.233874 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 19:08:25.233738 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:08:25.233687 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 19:08:35.233058 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:08:35.233012 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 19:08:45.233955 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:08:45.233912 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 19:08:52.238967 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:08:52.238939 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" Apr 16 19:09:01.380797 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.380696 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47"] Apr 16 19:09:01.381250 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.381085 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="kserve-container" containerID="cri-o://29b0be4e0b0443cb2139c73498e385370639f016582acddc82a30668dc3dea15" gracePeriod=30 Apr 16 19:09:01.490560 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.490525 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg"] Apr 16 19:09:01.491029 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.491014 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerName="storage-initializer" Apr 16 19:09:01.491091 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.491033 2567 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerName="storage-initializer" Apr 16 19:09:01.491128 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.491088 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerName="kserve-container" Apr 16 19:09:01.491128 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.491098 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerName="kserve-container" Apr 16 19:09:01.491198 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.491180 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="896a69c0-bfce-43a8-8155-9e6b08db3c25" containerName="kserve-container" Apr 16 19:09:01.494628 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.494605 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" Apr 16 19:09:01.497272 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.497248 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 19:09:01.505717 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.505694 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg"] Apr 16 19:09:01.575750 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.575720 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ea30509f-1d91-4a34-aebe-c34d599ea7f7-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg\" (UID: \"ea30509f-1d91-4a34-aebe-c34d599ea7f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" Apr 16 19:09:01.575903 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.575782 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea30509f-1d91-4a34-aebe-c34d599ea7f7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg\" (UID: \"ea30509f-1d91-4a34-aebe-c34d599ea7f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" Apr 16 19:09:01.676925 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.676849 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ea30509f-1d91-4a34-aebe-c34d599ea7f7-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg\" (UID: \"ea30509f-1d91-4a34-aebe-c34d599ea7f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" Apr 16 19:09:01.677092 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.676926 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea30509f-1d91-4a34-aebe-c34d599ea7f7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg\" (UID: \"ea30509f-1d91-4a34-aebe-c34d599ea7f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" Apr 16 19:09:01.677341 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.677321 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea30509f-1d91-4a34-aebe-c34d599ea7f7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg\" (UID: \"ea30509f-1d91-4a34-aebe-c34d599ea7f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" Apr 16 19:09:01.677493 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.677474 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ea30509f-1d91-4a34-aebe-c34d599ea7f7-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg\" (UID: \"ea30509f-1d91-4a34-aebe-c34d599ea7f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" Apr 16 19:09:01.806029 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.805992 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" Apr 16 19:09:01.923234 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:01.923210 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg"] Apr 16 19:09:01.925981 ip-10-0-141-192 kubenswrapper[2567]: W0416 19:09:01.925951 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea30509f_1d91_4a34_aebe_c34d599ea7f7.slice/crio-7d6f03b556a0cdc0252900e9637a6eb3e1b1feaafb931d533ab4b1af980142d0 WatchSource:0}: Error finding container 7d6f03b556a0cdc0252900e9637a6eb3e1b1feaafb931d533ab4b1af980142d0: Status 404 returned error can't find the container with id 7d6f03b556a0cdc0252900e9637a6eb3e1b1feaafb931d533ab4b1af980142d0 Apr 16 19:09:02.238246 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:02.238198 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.73:8080: connect: connection refused" Apr 16 19:09:02.482699 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:02.482663 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" event={"ID":"ea30509f-1d91-4a34-aebe-c34d599ea7f7","Type":"ContainerStarted","Data":"f68bdaefc57120597031f2e079dbfaf0481fa22c06b9bd9bd05194cca235bac9"} Apr 16 19:09:02.482699 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:02.482704 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" event={"ID":"ea30509f-1d91-4a34-aebe-c34d599ea7f7","Type":"ContainerStarted","Data":"7d6f03b556a0cdc0252900e9637a6eb3e1b1feaafb931d533ab4b1af980142d0"} Apr 16 19:09:03.487496 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:03.487460 2567 generic.go:358] "Generic (PLEG): container finished" podID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerID="f68bdaefc57120597031f2e079dbfaf0481fa22c06b9bd9bd05194cca235bac9" exitCode=0 Apr 16 19:09:03.487874 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:03.487547 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" 
event={"ID":"ea30509f-1d91-4a34-aebe-c34d599ea7f7","Type":"ContainerDied","Data":"f68bdaefc57120597031f2e079dbfaf0481fa22c06b9bd9bd05194cca235bac9"} Apr 16 19:09:04.491981 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:04.491937 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" event={"ID":"ea30509f-1d91-4a34-aebe-c34d599ea7f7","Type":"ContainerStarted","Data":"31c3af90abfe51181c9b4731eb10e58747d4f58b64233ac90eb9890b3e28fd87"} Apr 16 19:09:04.492409 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:04.492113 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" Apr 16 19:09:04.493434 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:04.493407 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 19:09:04.511873 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:04.511820 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" podStartSLOduration=3.511805146 podStartE2EDuration="3.511805146s" podCreationTimestamp="2026-04-16 19:09:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:09:04.509823005 +0000 UTC m=+3505.893486476" watchObservedRunningTime="2026-04-16 19:09:04.511805146 +0000 UTC m=+3505.895468617" Apr 16 19:09:05.495444 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:05.495407 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 19:09:06.026577 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.026554 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" Apr 16 19:09:06.116064 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.115958 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/056399fa-62bf-4fca-87da-40e2911b1d5b-kserve-provision-location\") pod \"056399fa-62bf-4fca-87da-40e2911b1d5b\" (UID: \"056399fa-62bf-4fca-87da-40e2911b1d5b\") " Apr 16 19:09:06.116299 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.116276 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/056399fa-62bf-4fca-87da-40e2911b1d5b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "056399fa-62bf-4fca-87da-40e2911b1d5b" (UID: "056399fa-62bf-4fca-87da-40e2911b1d5b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:09:06.216642 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.216606 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/056399fa-62bf-4fca-87da-40e2911b1d5b-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:09:06.499971 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.499941 2567 generic.go:358] "Generic (PLEG): container finished" podID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerID="29b0be4e0b0443cb2139c73498e385370639f016582acddc82a30668dc3dea15" exitCode=0 Apr 16 19:09:06.500370 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.499993 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" event={"ID":"056399fa-62bf-4fca-87da-40e2911b1d5b","Type":"ContainerDied","Data":"29b0be4e0b0443cb2139c73498e385370639f016582acddc82a30668dc3dea15"} Apr 16 19:09:06.500370 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.500006 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" Apr 16 19:09:06.500370 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.500019 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47" event={"ID":"056399fa-62bf-4fca-87da-40e2911b1d5b","Type":"ContainerDied","Data":"d26bcd13b83c2a67c04bdc387b155c4b54960c9ca48c01d360ec26468edfeaee"} Apr 16 19:09:06.500370 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.500035 2567 scope.go:117] "RemoveContainer" containerID="29b0be4e0b0443cb2139c73498e385370639f016582acddc82a30668dc3dea15" Apr 16 19:09:06.507788 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.507769 2567 scope.go:117] "RemoveContainer" containerID="7e62fcb0001d9756605cf01ec80878edcbe95dcaac1eef719f643960efc3f412" Apr 16 19:09:06.519102 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.514629 2567 scope.go:117] "RemoveContainer" containerID="29b0be4e0b0443cb2139c73498e385370639f016582acddc82a30668dc3dea15" Apr 16 19:09:06.519102 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:09:06.515391 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b0be4e0b0443cb2139c73498e385370639f016582acddc82a30668dc3dea15\": container with ID starting with 29b0be4e0b0443cb2139c73498e385370639f016582acddc82a30668dc3dea15 not found: ID does not exist" containerID="29b0be4e0b0443cb2139c73498e385370639f016582acddc82a30668dc3dea15" Apr 16 19:09:06.519102 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.515423 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b0be4e0b0443cb2139c73498e385370639f016582acddc82a30668dc3dea15"} err="failed to get container status \"29b0be4e0b0443cb2139c73498e385370639f016582acddc82a30668dc3dea15\": rpc error: code = NotFound desc = could not find container \"29b0be4e0b0443cb2139c73498e385370639f016582acddc82a30668dc3dea15\": container with ID starting with 29b0be4e0b0443cb2139c73498e385370639f016582acddc82a30668dc3dea15 not found: ID does not exist" Apr 16 19:09:06.519102 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.515460 2567 scope.go:117] "RemoveContainer" containerID="7e62fcb0001d9756605cf01ec80878edcbe95dcaac1eef719f643960efc3f412" Apr 16 19:09:06.519102 ip-10-0-141-192 kubenswrapper[2567]: 
E0416 19:09:06.515686 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e62fcb0001d9756605cf01ec80878edcbe95dcaac1eef719f643960efc3f412\": container with ID starting with 7e62fcb0001d9756605cf01ec80878edcbe95dcaac1eef719f643960efc3f412 not found: ID does not exist" containerID="7e62fcb0001d9756605cf01ec80878edcbe95dcaac1eef719f643960efc3f412" Apr 16 19:09:06.519102 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.515732 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e62fcb0001d9756605cf01ec80878edcbe95dcaac1eef719f643960efc3f412"} err="failed to get container status \"7e62fcb0001d9756605cf01ec80878edcbe95dcaac1eef719f643960efc3f412\": rpc error: code = NotFound desc = could not find container \"7e62fcb0001d9756605cf01ec80878edcbe95dcaac1eef719f643960efc3f412\": container with ID starting with 7e62fcb0001d9756605cf01ec80878edcbe95dcaac1eef719f643960efc3f412 not found: ID does not exist" Apr 16 19:09:06.523503 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.523477 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47"] Apr 16 19:09:06.524927 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:06.524898 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-85d8c7694d-q6d47"] Apr 16 19:09:07.242811 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:07.242778 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" path="/var/lib/kubelet/pods/056399fa-62bf-4fca-87da-40e2911b1d5b/volumes" Apr 16 19:09:15.495945 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:15.495904 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 19:09:25.496253 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:25.496217 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 19:09:35.496053 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:35.495991 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 19:09:45.495946 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:45.495899 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 19:09:55.495622 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:09:55.495580 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 19:10:05.496374 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:05.496320 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 19:10:15.497081 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:15.497034 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" Apr 16 19:10:21.547466 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:21.547427 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg"] Apr 16 19:10:21.547932 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:21.547754 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerName="kserve-container" containerID="cri-o://31c3af90abfe51181c9b4731eb10e58747d4f58b64233ac90eb9890b3e28fd87" gracePeriod=30 Apr 16 19:10:22.614670 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:22.614630 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd"] Apr 16 19:10:22.615105 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:22.615007 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="storage-initializer" Apr 16 19:10:22.615105 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:22.615021 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="storage-initializer" Apr 16 19:10:22.615105 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:22.615034 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="kserve-container" Apr 16 19:10:22.615105 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:22.615057 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="kserve-container" Apr 16 19:10:22.615269 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:22.615121 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="056399fa-62bf-4fca-87da-40e2911b1d5b" containerName="kserve-container" Apr 16 19:10:22.618215 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:22.618199 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd" Apr 16 19:10:22.624549 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:22.624520 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd"] Apr 16 19:10:22.761448 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:22.761404 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/648bcaca-5b28-4425-ab98-9b34393672a5-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd\" (UID: \"648bcaca-5b28-4425-ab98-9b34393672a5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd" Apr 16 19:10:22.862896 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:22.862862 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/648bcaca-5b28-4425-ab98-9b34393672a5-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd\" (UID: \"648bcaca-5b28-4425-ab98-9b34393672a5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd" Apr 16 19:10:22.863242 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:22.863219 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/648bcaca-5b28-4425-ab98-9b34393672a5-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd\" (UID: \"648bcaca-5b28-4425-ab98-9b34393672a5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd" Apr 16 19:10:22.930696 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:22.930609 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd" Apr 16 19:10:23.056212 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:23.056154 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd"] Apr 16 19:10:23.058986 ip-10-0-141-192 kubenswrapper[2567]: W0416 19:10:23.058960 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod648bcaca_5b28_4425_ab98_9b34393672a5.slice/crio-197b203ad7d18c82f63e66671eea67ffcc0c5cfa0c0feea96153cf626a2dd28a WatchSource:0}: Error finding container 197b203ad7d18c82f63e66671eea67ffcc0c5cfa0c0feea96153cf626a2dd28a: Status 404 returned error can't find the container with id 197b203ad7d18c82f63e66671eea67ffcc0c5cfa0c0feea96153cf626a2dd28a Apr 16 19:10:23.770569 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:23.770526 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd" event={"ID":"648bcaca-5b28-4425-ab98-9b34393672a5","Type":"ContainerStarted","Data":"1224075c486d1457ec632ec9ab8477827723c4665abe5b5494f0ee75b73a2f25"} Apr 16 19:10:23.770569 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:23.770575 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd" event={"ID":"648bcaca-5b28-4425-ab98-9b34393672a5","Type":"ContainerStarted","Data":"197b203ad7d18c82f63e66671eea67ffcc0c5cfa0c0feea96153cf626a2dd28a"} Apr 16 19:10:24.775051 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:24.775025 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd_648bcaca-5b28-4425-ab98-9b34393672a5/storage-initializer/0.log" Apr 16 19:10:24.775416 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:24.775088 2567 generic.go:358] "Generic (PLEG): container finished" podID="648bcaca-5b28-4425-ab98-9b34393672a5" containerID="1224075c486d1457ec632ec9ab8477827723c4665abe5b5494f0ee75b73a2f25" exitCode=1 Apr 16 19:10:24.775416 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:24.775138 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd" event={"ID":"648bcaca-5b28-4425-ab98-9b34393672a5","Type":"ContainerDied","Data":"1224075c486d1457ec632ec9ab8477827723c4665abe5b5494f0ee75b73a2f25"} Apr 16 19:10:25.495785 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:25.495745 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 19:10:25.781393 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:25.781362 2567 generic.go:358] "Generic (PLEG): container finished" podID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerID="31c3af90abfe51181c9b4731eb10e58747d4f58b64233ac90eb9890b3e28fd87" exitCode=0 Apr 16 19:10:25.781753 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:25.781425 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" 
event={"ID":"ea30509f-1d91-4a34-aebe-c34d599ea7f7","Type":"ContainerDied","Data":"31c3af90abfe51181c9b4731eb10e58747d4f58b64233ac90eb9890b3e28fd87"} Apr 16 19:10:25.783195 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:25.783176 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd_648bcaca-5b28-4425-ab98-9b34393672a5/storage-initializer/0.log" Apr 16 19:10:25.783306 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:25.783216 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd" event={"ID":"648bcaca-5b28-4425-ab98-9b34393672a5","Type":"ContainerStarted","Data":"949a563d760d8768faf22c7ffc45dad835f38d749f764c76e7a97ac19ed80731"} Apr 16 19:10:25.896525 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:25.896505 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" Apr 16 19:10:25.989967 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:25.989934 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea30509f-1d91-4a34-aebe-c34d599ea7f7-kserve-provision-location\") pod \"ea30509f-1d91-4a34-aebe-c34d599ea7f7\" (UID: \"ea30509f-1d91-4a34-aebe-c34d599ea7f7\") " Apr 16 19:10:25.990145 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:25.990001 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ea30509f-1d91-4a34-aebe-c34d599ea7f7-cabundle-cert\") pod \"ea30509f-1d91-4a34-aebe-c34d599ea7f7\" (UID: \"ea30509f-1d91-4a34-aebe-c34d599ea7f7\") " Apr 16 19:10:25.990258 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:25.990234 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea30509f-1d91-4a34-aebe-c34d599ea7f7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ea30509f-1d91-4a34-aebe-c34d599ea7f7" (UID: "ea30509f-1d91-4a34-aebe-c34d599ea7f7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:10:25.990337 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:25.990319 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea30509f-1d91-4a34-aebe-c34d599ea7f7-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "ea30509f-1d91-4a34-aebe-c34d599ea7f7" (UID: "ea30509f-1d91-4a34-aebe-c34d599ea7f7"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:10:26.090796 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:26.090716 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea30509f-1d91-4a34-aebe-c34d599ea7f7-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:10:26.090796 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:26.090747 2567 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ea30509f-1d91-4a34-aebe-c34d599ea7f7-cabundle-cert\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:10:26.787592 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:26.787564 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" Apr 16 19:10:26.788033 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:26.787589 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg" event={"ID":"ea30509f-1d91-4a34-aebe-c34d599ea7f7","Type":"ContainerDied","Data":"7d6f03b556a0cdc0252900e9637a6eb3e1b1feaafb931d533ab4b1af980142d0"} Apr 16 19:10:26.788033 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:26.787639 2567 scope.go:117] "RemoveContainer" containerID="31c3af90abfe51181c9b4731eb10e58747d4f58b64233ac90eb9890b3e28fd87" Apr 16 19:10:26.795897 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:26.795881 2567 scope.go:117] "RemoveContainer" containerID="f68bdaefc57120597031f2e079dbfaf0481fa22c06b9bd9bd05194cca235bac9" Apr 16 19:10:26.807844 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:26.807821 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg"] Apr 16 19:10:26.813703 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:26.813678 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-687fddf9b-t58jg"] Apr 16 19:10:27.242176 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:27.242144 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" path="/var/lib/kubelet/pods/ea30509f-1d91-4a34-aebe-c34d599ea7f7/volumes" Apr 16 19:10:27.792083 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:27.792034 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd_648bcaca-5b28-4425-ab98-9b34393672a5/storage-initializer/1.log" Apr 16 19:10:27.792485 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:27.792371 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd_648bcaca-5b28-4425-ab98-9b34393672a5/storage-initializer/0.log" Apr 16 19:10:27.792485 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:27.792402 2567 generic.go:358] "Generic (PLEG): container finished" podID="648bcaca-5b28-4425-ab98-9b34393672a5" containerID="949a563d760d8768faf22c7ffc45dad835f38d749f764c76e7a97ac19ed80731" exitCode=1 Apr 16 19:10:27.792485 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:27.792460 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd" 
event={"ID":"648bcaca-5b28-4425-ab98-9b34393672a5","Type":"ContainerDied","Data":"949a563d760d8768faf22c7ffc45dad835f38d749f764c76e7a97ac19ed80731"} Apr 16 19:10:27.792610 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:27.792493 2567 scope.go:117] "RemoveContainer" containerID="1224075c486d1457ec632ec9ab8477827723c4665abe5b5494f0ee75b73a2f25" Apr 16 19:10:27.792854 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:27.792831 2567 scope.go:117] "RemoveContainer" containerID="1224075c486d1457ec632ec9ab8477827723c4665abe5b5494f0ee75b73a2f25" Apr 16 19:10:27.802815 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:10:27.802780 2567 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd_kserve-ci-e2e-test_648bcaca-5b28-4425-ab98-9b34393672a5_0 in pod sandbox 197b203ad7d18c82f63e66671eea67ffcc0c5cfa0c0feea96153cf626a2dd28a from index: no such id: '1224075c486d1457ec632ec9ab8477827723c4665abe5b5494f0ee75b73a2f25'" containerID="1224075c486d1457ec632ec9ab8477827723c4665abe5b5494f0ee75b73a2f25" Apr 16 19:10:27.802912 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:27.802826 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1224075c486d1457ec632ec9ab8477827723c4665abe5b5494f0ee75b73a2f25"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd_kserve-ci-e2e-test_648bcaca-5b28-4425-ab98-9b34393672a5_0 in pod sandbox 197b203ad7d18c82f63e66671eea67ffcc0c5cfa0c0feea96153cf626a2dd28a from index: no such id: '1224075c486d1457ec632ec9ab8477827723c4665abe5b5494f0ee75b73a2f25'" Apr 16 19:10:27.802960 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:10:27.802926 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd_kserve-ci-e2e-test(648bcaca-5b28-4425-ab98-9b34393672a5)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd" podUID="648bcaca-5b28-4425-ab98-9b34393672a5" Apr 16 19:10:28.797616 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:28.797585 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd_648bcaca-5b28-4425-ab98-9b34393672a5/storage-initializer/1.log" Apr 16 19:10:32.618911 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:32.618872 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd"] Apr 16 19:10:32.752316 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:32.752294 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd_648bcaca-5b28-4425-ab98-9b34393672a5/storage-initializer/1.log" Apr 16 19:10:32.752436 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:32.752352 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd" Apr 16 19:10:32.811524 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:32.811451 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd_648bcaca-5b28-4425-ab98-9b34393672a5/storage-initializer/1.log" Apr 16 19:10:32.811659 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:32.811572 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd" Apr 16 19:10:32.811659 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:32.811584 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd" event={"ID":"648bcaca-5b28-4425-ab98-9b34393672a5","Type":"ContainerDied","Data":"197b203ad7d18c82f63e66671eea67ffcc0c5cfa0c0feea96153cf626a2dd28a"} Apr 16 19:10:32.811659 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:32.811630 2567 scope.go:117] "RemoveContainer" containerID="949a563d760d8768faf22c7ffc45dad835f38d749f764c76e7a97ac19ed80731" Apr 16 19:10:32.844425 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:32.844400 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/648bcaca-5b28-4425-ab98-9b34393672a5-kserve-provision-location\") pod \"648bcaca-5b28-4425-ab98-9b34393672a5\" (UID: \"648bcaca-5b28-4425-ab98-9b34393672a5\") " Apr 16 19:10:32.844683 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:32.844660 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/648bcaca-5b28-4425-ab98-9b34393672a5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "648bcaca-5b28-4425-ab98-9b34393672a5" (UID: "648bcaca-5b28-4425-ab98-9b34393672a5"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:10:32.945937 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:32.945900 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/648bcaca-5b28-4425-ab98-9b34393672a5-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:10:33.145520 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.145480 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd"] Apr 16 19:10:33.149424 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.149400 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6d4bd66fcd-szsjd"] Apr 16 19:10:33.242130 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.242099 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="648bcaca-5b28-4425-ab98-9b34393672a5" path="/var/lib/kubelet/pods/648bcaca-5b28-4425-ab98-9b34393672a5/volumes" Apr 16 19:10:33.696533 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.696500 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs"] Apr 16 19:10:33.697016 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.696994 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerName="kserve-container" Apr 16 19:10:33.697151 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.697022 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerName="kserve-container" Apr 16 19:10:33.697151 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.697053 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="648bcaca-5b28-4425-ab98-9b34393672a5" containerName="storage-initializer" Apr 16 19:10:33.697151 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.697063 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="648bcaca-5b28-4425-ab98-9b34393672a5" containerName="storage-initializer" Apr 16 19:10:33.697151 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.697075 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerName="storage-initializer" Apr 16 19:10:33.697151 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.697084 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerName="storage-initializer" Apr 16 19:10:33.697428 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.697230 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="648bcaca-5b28-4425-ab98-9b34393672a5" containerName="storage-initializer" Apr 16 19:10:33.697428 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.697249 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="648bcaca-5b28-4425-ab98-9b34393672a5" containerName="storage-initializer" Apr 16 19:10:33.697428 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.697263 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea30509f-1d91-4a34-aebe-c34d599ea7f7" containerName="kserve-container" Apr 16 19:10:33.697428 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.697376 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="648bcaca-5b28-4425-ab98-9b34393672a5" containerName="storage-initializer" Apr 16 19:10:33.697428 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.697387 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="648bcaca-5b28-4425-ab98-9b34393672a5" containerName="storage-initializer" Apr 16 19:10:33.701821 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.701800 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" Apr 16 19:10:33.704502 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.704480 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 19:10:33.704618 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.704543 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 19:10:33.704673 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.704627 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ww52g\"" Apr 16 19:10:33.708499 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.708474 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs"] Apr 16 19:10:33.854653 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.854619 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f397f650-022a-410a-ac92-37efe6a820ef-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs\" (UID: \"f397f650-022a-410a-ac92-37efe6a820ef\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" Apr 16 19:10:33.854808 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.854669 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f397f650-022a-410a-ac92-37efe6a820ef-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs\" (UID: \"f397f650-022a-410a-ac92-37efe6a820ef\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" Apr 16 19:10:33.955661 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.955560 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f397f650-022a-410a-ac92-37efe6a820ef-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs\" (UID: \"f397f650-022a-410a-ac92-37efe6a820ef\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" Apr 16 19:10:33.955661 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.955610 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f397f650-022a-410a-ac92-37efe6a820ef-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs\" (UID: \"f397f650-022a-410a-ac92-37efe6a820ef\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" Apr 16 19:10:33.955969 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.955942 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f397f650-022a-410a-ac92-37efe6a820ef-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs\" (UID: \"f397f650-022a-410a-ac92-37efe6a820ef\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" Apr 16 19:10:33.956294 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:33.956271 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f397f650-022a-410a-ac92-37efe6a820ef-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs\" (UID: \"f397f650-022a-410a-ac92-37efe6a820ef\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" Apr 16 19:10:34.014747 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:34.014708 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" Apr 16 19:10:34.143358 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:34.143327 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs"] Apr 16 19:10:34.145585 ip-10-0-141-192 kubenswrapper[2567]: W0416 19:10:34.145557 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf397f650_022a_410a_ac92_37efe6a820ef.slice/crio-4fbc7c153b5ec9d16351ec581c27bbb1785bcf7194de96085284d87bee5db16c WatchSource:0}: Error finding container 4fbc7c153b5ec9d16351ec581c27bbb1785bcf7194de96085284d87bee5db16c: Status 404 returned error can't find the container with id 4fbc7c153b5ec9d16351ec581c27bbb1785bcf7194de96085284d87bee5db16c Apr 16 19:10:34.820313 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:34.820265 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" event={"ID":"f397f650-022a-410a-ac92-37efe6a820ef","Type":"ContainerStarted","Data":"0e264b285d5cd5db189821ae4ed42508df2dfd22dddeb4782671650bbb5d944a"} Apr 16 19:10:34.820685 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:34.820318 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" event={"ID":"f397f650-022a-410a-ac92-37efe6a820ef","Type":"ContainerStarted","Data":"4fbc7c153b5ec9d16351ec581c27bbb1785bcf7194de96085284d87bee5db16c"} Apr 16 19:10:35.825404 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:35.825366 2567 generic.go:358] "Generic (PLEG): container finished" podID="f397f650-022a-410a-ac92-37efe6a820ef" containerID="0e264b285d5cd5db189821ae4ed42508df2dfd22dddeb4782671650bbb5d944a" exitCode=0 Apr 16 19:10:35.825839 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:35.825445 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" event={"ID":"f397f650-022a-410a-ac92-37efe6a820ef","Type":"ContainerDied","Data":"0e264b285d5cd5db189821ae4ed42508df2dfd22dddeb4782671650bbb5d944a"} Apr 16 19:10:36.830277 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:36.830244 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" 
event={"ID":"f397f650-022a-410a-ac92-37efe6a820ef","Type":"ContainerStarted","Data":"14e26b4b410a1b272bed1c9190c67bca44fed73565cd4cea337c92e741eba3b6"} Apr 16 19:10:36.830764 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:36.830466 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" Apr 16 19:10:36.831609 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:36.831584 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.76:8080: connect: connection refused" Apr 16 19:10:36.846826 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:36.846775 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" podStartSLOduration=3.846757081 podStartE2EDuration="3.846757081s" podCreationTimestamp="2026-04-16 19:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:10:36.845790316 +0000 UTC m=+3598.229453825" watchObservedRunningTime="2026-04-16 19:10:36.846757081 +0000 UTC m=+3598.230420555" Apr 16 19:10:37.834096 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:37.834058 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.76:8080: connect: connection refused" Apr 16 19:10:47.834611 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:47.834568 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.76:8080: connect: connection refused" Apr 16 19:10:57.834699 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:10:57.834659 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.76:8080: connect: connection refused" Apr 16 19:11:07.835061 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:07.835003 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.76:8080: connect: connection refused" Apr 16 19:11:17.834955 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:17.834911 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.76:8080: connect: connection refused" Apr 16 19:11:27.834459 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:27.834335 2567 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.76:8080: connect: connection refused" Apr 16 19:11:37.835018 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:37.834975 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.76:8080: connect: connection refused" Apr 16 19:11:47.835923 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:47.835889 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" Apr 16 19:11:53.729385 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:53.729345 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs"] Apr 16 19:11:53.729854 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:53.729717 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="kserve-container" containerID="cri-o://14e26b4b410a1b272bed1c9190c67bca44fed73565cd4cea337c92e741eba3b6" gracePeriod=30 Apr 16 19:11:54.800613 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:54.800579 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn"] Apr 16 19:11:54.803952 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:54.803926 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn" Apr 16 19:11:54.812223 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:54.812195 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn"] Apr 16 19:11:54.829688 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:54.829657 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e04246c0-c34c-4755-8b2b-5ae73113bf30-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn\" (UID: \"e04246c0-c34c-4755-8b2b-5ae73113bf30\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn" Apr 16 19:11:54.930573 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:54.930533 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e04246c0-c34c-4755-8b2b-5ae73113bf30-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn\" (UID: \"e04246c0-c34c-4755-8b2b-5ae73113bf30\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn" Apr 16 19:11:54.930920 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:54.930899 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e04246c0-c34c-4755-8b2b-5ae73113bf30-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn\" (UID: \"e04246c0-c34c-4755-8b2b-5ae73113bf30\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn" Apr 16 19:11:55.115536 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:55.115446 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn" Apr 16 19:11:55.243181 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:55.243152 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn"] Apr 16 19:11:55.243330 ip-10-0-141-192 kubenswrapper[2567]: W0416 19:11:55.243204 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode04246c0_c34c_4755_8b2b_5ae73113bf30.slice/crio-e01b26f2e4dc7230640f79c9c81681d129035fcb37c448497c79a653756aaa9a WatchSource:0}: Error finding container e01b26f2e4dc7230640f79c9c81681d129035fcb37c448497c79a653756aaa9a: Status 404 returned error can't find the container with id e01b26f2e4dc7230640f79c9c81681d129035fcb37c448497c79a653756aaa9a Apr 16 19:11:55.245505 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:55.245476 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:11:56.093072 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:56.092982 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn" event={"ID":"e04246c0-c34c-4755-8b2b-5ae73113bf30","Type":"ContainerStarted","Data":"f0344307f347cb7c8a3d81dca0e8947aeff5900513eb64d5cf0f1c2a0f5b3e8a"} Apr 16 19:11:56.093072 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:56.093018 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn" event={"ID":"e04246c0-c34c-4755-8b2b-5ae73113bf30","Type":"ContainerStarted","Data":"e01b26f2e4dc7230640f79c9c81681d129035fcb37c448497c79a653756aaa9a"} Apr 16 19:11:57.834841 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:57.834800 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.76:8080: connect: connection refused" Apr 16 19:11:58.076914 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.076892 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" Apr 16 19:11:58.100551 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.100478 2567 generic.go:358] "Generic (PLEG): container finished" podID="f397f650-022a-410a-ac92-37efe6a820ef" containerID="14e26b4b410a1b272bed1c9190c67bca44fed73565cd4cea337c92e741eba3b6" exitCode=0 Apr 16 19:11:58.100551 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.100530 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" event={"ID":"f397f650-022a-410a-ac92-37efe6a820ef","Type":"ContainerDied","Data":"14e26b4b410a1b272bed1c9190c67bca44fed73565cd4cea337c92e741eba3b6"} Apr 16 19:11:58.100706 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.100556 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" event={"ID":"f397f650-022a-410a-ac92-37efe6a820ef","Type":"ContainerDied","Data":"4fbc7c153b5ec9d16351ec581c27bbb1785bcf7194de96085284d87bee5db16c"} Apr 16 19:11:58.100706 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.100560 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs" Apr 16 19:11:58.100706 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.100570 2567 scope.go:117] "RemoveContainer" containerID="14e26b4b410a1b272bed1c9190c67bca44fed73565cd4cea337c92e741eba3b6" Apr 16 19:11:58.108084 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.108066 2567 scope.go:117] "RemoveContainer" containerID="0e264b285d5cd5db189821ae4ed42508df2dfd22dddeb4782671650bbb5d944a" Apr 16 19:11:58.115510 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.115492 2567 scope.go:117] "RemoveContainer" containerID="14e26b4b410a1b272bed1c9190c67bca44fed73565cd4cea337c92e741eba3b6" Apr 16 19:11:58.115781 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:11:58.115756 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e26b4b410a1b272bed1c9190c67bca44fed73565cd4cea337c92e741eba3b6\": container with ID starting with 14e26b4b410a1b272bed1c9190c67bca44fed73565cd4cea337c92e741eba3b6 not found: ID does not exist" containerID="14e26b4b410a1b272bed1c9190c67bca44fed73565cd4cea337c92e741eba3b6" Apr 16 19:11:58.115871 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.115788 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e26b4b410a1b272bed1c9190c67bca44fed73565cd4cea337c92e741eba3b6"} err="failed to get container status \"14e26b4b410a1b272bed1c9190c67bca44fed73565cd4cea337c92e741eba3b6\": rpc error: code = NotFound desc = could not find container \"14e26b4b410a1b272bed1c9190c67bca44fed73565cd4cea337c92e741eba3b6\": container with ID starting with 14e26b4b410a1b272bed1c9190c67bca44fed73565cd4cea337c92e741eba3b6 not found: ID does not exist" Apr 16 19:11:58.115871 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.115804 2567 scope.go:117] "RemoveContainer" containerID="0e264b285d5cd5db189821ae4ed42508df2dfd22dddeb4782671650bbb5d944a" Apr 16 19:11:58.116099 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:11:58.116072 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e264b285d5cd5db189821ae4ed42508df2dfd22dddeb4782671650bbb5d944a\": container with 
ID starting with 0e264b285d5cd5db189821ae4ed42508df2dfd22dddeb4782671650bbb5d944a not found: ID does not exist" containerID="0e264b285d5cd5db189821ae4ed42508df2dfd22dddeb4782671650bbb5d944a" Apr 16 19:11:58.116155 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.116108 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e264b285d5cd5db189821ae4ed42508df2dfd22dddeb4782671650bbb5d944a"} err="failed to get container status \"0e264b285d5cd5db189821ae4ed42508df2dfd22dddeb4782671650bbb5d944a\": rpc error: code = NotFound desc = could not find container \"0e264b285d5cd5db189821ae4ed42508df2dfd22dddeb4782671650bbb5d944a\": container with ID starting with 0e264b285d5cd5db189821ae4ed42508df2dfd22dddeb4782671650bbb5d944a not found: ID does not exist" Apr 16 19:11:58.156977 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.156948 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f397f650-022a-410a-ac92-37efe6a820ef-kserve-provision-location\") pod \"f397f650-022a-410a-ac92-37efe6a820ef\" (UID: \"f397f650-022a-410a-ac92-37efe6a820ef\") " Apr 16 19:11:58.157122 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.157027 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f397f650-022a-410a-ac92-37efe6a820ef-cabundle-cert\") pod \"f397f650-022a-410a-ac92-37efe6a820ef\" (UID: \"f397f650-022a-410a-ac92-37efe6a820ef\") " Apr 16 19:11:58.157257 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.157237 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f397f650-022a-410a-ac92-37efe6a820ef-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f397f650-022a-410a-ac92-37efe6a820ef" (UID: "f397f650-022a-410a-ac92-37efe6a820ef"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:11:58.157330 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.157311 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f397f650-022a-410a-ac92-37efe6a820ef-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "f397f650-022a-410a-ac92-37efe6a820ef" (UID: "f397f650-022a-410a-ac92-37efe6a820ef"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:11:58.257764 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.257732 2567 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f397f650-022a-410a-ac92-37efe6a820ef-cabundle-cert\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:11:58.257764 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.257757 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f397f650-022a-410a-ac92-37efe6a820ef-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:11:58.423400 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.423367 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs"] Apr 16 19:11:58.425009 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:58.424988 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-58ff675ffb-pfjjs"] Apr 16 19:11:59.244026 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:11:59.243985 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f397f650-022a-410a-ac92-37efe6a820ef" path="/var/lib/kubelet/pods/f397f650-022a-410a-ac92-37efe6a820ef/volumes" Apr 16 19:12:00.109184 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:00.109110 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn_e04246c0-c34c-4755-8b2b-5ae73113bf30/storage-initializer/0.log" Apr 16 19:12:00.109184 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:00.109148 2567 generic.go:358] "Generic (PLEG): container finished" podID="e04246c0-c34c-4755-8b2b-5ae73113bf30" containerID="f0344307f347cb7c8a3d81dca0e8947aeff5900513eb64d5cf0f1c2a0f5b3e8a" exitCode=1 Apr 16 19:12:00.109371 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:00.109230 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn" event={"ID":"e04246c0-c34c-4755-8b2b-5ae73113bf30","Type":"ContainerDied","Data":"f0344307f347cb7c8a3d81dca0e8947aeff5900513eb64d5cf0f1c2a0f5b3e8a"} Apr 16 19:12:01.113715 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:01.113686 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn_e04246c0-c34c-4755-8b2b-5ae73113bf30/storage-initializer/0.log" Apr 16 19:12:01.114213 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:01.113769 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn" event={"ID":"e04246c0-c34c-4755-8b2b-5ae73113bf30","Type":"ContainerStarted","Data":"ac0e37f0e0bd4a67a05c1ba591c06278211451169405f7033ab6a0c3cd5d6c00"} Apr 16 19:12:03.120976 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:03.120903 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn_e04246c0-c34c-4755-8b2b-5ae73113bf30/storage-initializer/1.log" Apr 16 19:12:03.121365 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:03.121258 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn_e04246c0-c34c-4755-8b2b-5ae73113bf30/storage-initializer/0.log" Apr 16 19:12:03.121365 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:03.121289 2567 generic.go:358] "Generic (PLEG): container finished" podID="e04246c0-c34c-4755-8b2b-5ae73113bf30" containerID="ac0e37f0e0bd4a67a05c1ba591c06278211451169405f7033ab6a0c3cd5d6c00" exitCode=1 Apr 16 19:12:03.121365 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:03.121317 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn" event={"ID":"e04246c0-c34c-4755-8b2b-5ae73113bf30","Type":"ContainerDied","Data":"ac0e37f0e0bd4a67a05c1ba591c06278211451169405f7033ab6a0c3cd5d6c00"} Apr 16 19:12:03.121365 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:03.121343 2567 scope.go:117] "RemoveContainer" containerID="f0344307f347cb7c8a3d81dca0e8947aeff5900513eb64d5cf0f1c2a0f5b3e8a" Apr 16 19:12:03.121704 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:03.121683 2567 scope.go:117] "RemoveContainer" containerID="f0344307f347cb7c8a3d81dca0e8947aeff5900513eb64d5cf0f1c2a0f5b3e8a" Apr 16 19:12:03.135512 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:12:03.135474 2567 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn_kserve-ci-e2e-test_e04246c0-c34c-4755-8b2b-5ae73113bf30_0 in pod sandbox e01b26f2e4dc7230640f79c9c81681d129035fcb37c448497c79a653756aaa9a from index: no such id: 'f0344307f347cb7c8a3d81dca0e8947aeff5900513eb64d5cf0f1c2a0f5b3e8a'" containerID="f0344307f347cb7c8a3d81dca0e8947aeff5900513eb64d5cf0f1c2a0f5b3e8a" Apr 16 19:12:03.135575 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:12:03.135534 2567 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn_kserve-ci-e2e-test_e04246c0-c34c-4755-8b2b-5ae73113bf30_0 in pod sandbox e01b26f2e4dc7230640f79c9c81681d129035fcb37c448497c79a653756aaa9a from index: no such id: 'f0344307f347cb7c8a3d81dca0e8947aeff5900513eb64d5cf0f1c2a0f5b3e8a'; Skipping pod \"isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn_kserve-ci-e2e-test(e04246c0-c34c-4755-8b2b-5ae73113bf30)\"" logger="UnhandledError" Apr 16 19:12:03.136853 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:12:03.136832 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn_kserve-ci-e2e-test(e04246c0-c34c-4755-8b2b-5ae73113bf30)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn" podUID="e04246c0-c34c-4755-8b2b-5ae73113bf30" Apr 16 19:12:04.126403 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:04.126375 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn_e04246c0-c34c-4755-8b2b-5ae73113bf30/storage-initializer/1.log" Apr 16 19:12:04.795753 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:04.795718 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn"] Apr 16 19:12:04.934430 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:04.934404 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn_e04246c0-c34c-4755-8b2b-5ae73113bf30/storage-initializer/1.log" Apr 16 19:12:04.934557 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:04.934465 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn" Apr 16 19:12:05.015511 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.015476 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e04246c0-c34c-4755-8b2b-5ae73113bf30-kserve-provision-location\") pod \"e04246c0-c34c-4755-8b2b-5ae73113bf30\" (UID: \"e04246c0-c34c-4755-8b2b-5ae73113bf30\") " Apr 16 19:12:05.015776 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.015755 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e04246c0-c34c-4755-8b2b-5ae73113bf30-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e04246c0-c34c-4755-8b2b-5ae73113bf30" (UID: "e04246c0-c34c-4755-8b2b-5ae73113bf30"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:12:05.116259 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.116177 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e04246c0-c34c-4755-8b2b-5ae73113bf30-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:12:05.130742 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.130722 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn_e04246c0-c34c-4755-8b2b-5ae73113bf30/storage-initializer/1.log" Apr 16 19:12:05.131136 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.130799 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn" event={"ID":"e04246c0-c34c-4755-8b2b-5ae73113bf30","Type":"ContainerDied","Data":"e01b26f2e4dc7230640f79c9c81681d129035fcb37c448497c79a653756aaa9a"} Apr 16 19:12:05.131136 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.130827 2567 scope.go:117] "RemoveContainer" containerID="ac0e37f0e0bd4a67a05c1ba591c06278211451169405f7033ab6a0c3cd5d6c00" Apr 16 19:12:05.131136 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.130833 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn" Apr 16 19:12:05.165185 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.165156 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn"] Apr 16 19:12:05.166866 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.166844 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-64b6c79678-w7hzn"] Apr 16 19:12:05.241798 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.241768 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e04246c0-c34c-4755-8b2b-5ae73113bf30" path="/var/lib/kubelet/pods/e04246c0-c34c-4755-8b2b-5ae73113bf30/volumes" Apr 16 19:12:05.898930 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.898894 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw"] Apr 16 19:12:05.899236 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.899223 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e04246c0-c34c-4755-8b2b-5ae73113bf30" containerName="storage-initializer" Apr 16 19:12:05.899307 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.899237 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04246c0-c34c-4755-8b2b-5ae73113bf30" containerName="storage-initializer" Apr 16 19:12:05.899307 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.899247 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="kserve-container" Apr 16 19:12:05.899307 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.899253 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="kserve-container" Apr 16 19:12:05.899307 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.899266 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="storage-initializer" Apr 16 19:12:05.899307 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.899272 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="storage-initializer" Apr 16 19:12:05.899307 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.899279 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e04246c0-c34c-4755-8b2b-5ae73113bf30" containerName="storage-initializer" Apr 16 19:12:05.899307 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.899284 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04246c0-c34c-4755-8b2b-5ae73113bf30" containerName="storage-initializer" Apr 16 19:12:05.899666 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.899343 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f397f650-022a-410a-ac92-37efe6a820ef" containerName="kserve-container" Apr 16 19:12:05.899666 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.899361 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e04246c0-c34c-4755-8b2b-5ae73113bf30" containerName="storage-initializer" Apr 16 19:12:05.899666 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.899372 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e04246c0-c34c-4755-8b2b-5ae73113bf30" containerName="storage-initializer" Apr 16 
19:12:05.904404 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.904384 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" Apr 16 19:12:05.908088 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.908032 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 19:12:05.911557 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.911538 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 19:12:05.913849 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.913829 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ww52g\"" Apr 16 19:12:05.923493 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.923470 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e12f73b9-2368-4d92-a36e-adc63facb7d8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw\" (UID: \"e12f73b9-2368-4d92-a36e-adc63facb7d8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" Apr 16 19:12:05.923584 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.923512 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e12f73b9-2368-4d92-a36e-adc63facb7d8-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw\" (UID: \"e12f73b9-2368-4d92-a36e-adc63facb7d8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" Apr 16 19:12:05.931206 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:05.931182 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw"] Apr 16 19:12:06.024230 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:06.024183 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e12f73b9-2368-4d92-a36e-adc63facb7d8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw\" (UID: \"e12f73b9-2368-4d92-a36e-adc63facb7d8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" Apr 16 19:12:06.024411 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:06.024246 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e12f73b9-2368-4d92-a36e-adc63facb7d8-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw\" (UID: \"e12f73b9-2368-4d92-a36e-adc63facb7d8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" Apr 16 19:12:06.024620 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:06.024597 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e12f73b9-2368-4d92-a36e-adc63facb7d8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw\" (UID: \"e12f73b9-2368-4d92-a36e-adc63facb7d8\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" Apr 16 19:12:06.024847 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:06.024829 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e12f73b9-2368-4d92-a36e-adc63facb7d8-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw\" (UID: \"e12f73b9-2368-4d92-a36e-adc63facb7d8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" Apr 16 19:12:06.214700 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:06.214661 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" Apr 16 19:12:06.332225 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:06.332191 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw"] Apr 16 19:12:06.335365 ip-10-0-141-192 kubenswrapper[2567]: W0416 19:12:06.335336 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode12f73b9_2368_4d92_a36e_adc63facb7d8.slice/crio-da0353d8b160b849a8ef52ad01a8a85c41b3d9df0b3cb3c2219b3a492167cf3a WatchSource:0}: Error finding container da0353d8b160b849a8ef52ad01a8a85c41b3d9df0b3cb3c2219b3a492167cf3a: Status 404 returned error can't find the container with id da0353d8b160b849a8ef52ad01a8a85c41b3d9df0b3cb3c2219b3a492167cf3a Apr 16 19:12:07.140920 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:07.140883 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" event={"ID":"e12f73b9-2368-4d92-a36e-adc63facb7d8","Type":"ContainerStarted","Data":"0943ee5d6303f65526f3e3848a3ee17a623504d53aefa0e3c8e1b0765ef5affc"} Apr 16 19:12:07.140920 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:07.140923 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" event={"ID":"e12f73b9-2368-4d92-a36e-adc63facb7d8","Type":"ContainerStarted","Data":"da0353d8b160b849a8ef52ad01a8a85c41b3d9df0b3cb3c2219b3a492167cf3a"} Apr 16 19:12:08.145162 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:08.145127 2567 generic.go:358] "Generic (PLEG): container finished" podID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerID="0943ee5d6303f65526f3e3848a3ee17a623504d53aefa0e3c8e1b0765ef5affc" exitCode=0 Apr 16 19:12:08.145550 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:08.145199 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" event={"ID":"e12f73b9-2368-4d92-a36e-adc63facb7d8","Type":"ContainerDied","Data":"0943ee5d6303f65526f3e3848a3ee17a623504d53aefa0e3c8e1b0765ef5affc"} Apr 16 19:12:09.150364 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:09.150318 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" event={"ID":"e12f73b9-2368-4d92-a36e-adc63facb7d8","Type":"ContainerStarted","Data":"2939c9b2578cd3fa105550ea28b01c5ea98717f9eb86ad9fc784d514172acf92"} Apr 16 19:12:09.150938 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:09.150457 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" Apr 16 19:12:09.151709 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:09.151684 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.78:8080: connect: connection refused" Apr 16 19:12:09.168137 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:09.168091 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" podStartSLOduration=4.1680762399999995 podStartE2EDuration="4.16807624s" podCreationTimestamp="2026-04-16 19:12:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:12:09.166341014 +0000 UTC m=+3690.550004486" watchObservedRunningTime="2026-04-16 19:12:09.16807624 +0000 UTC m=+3690.551739711" Apr 16 19:12:10.153879 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:10.153839 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.78:8080: connect: connection refused" Apr 16 19:12:20.154237 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:20.154196 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.78:8080: connect: connection refused" Apr 16 19:12:30.154244 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:30.154193 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.78:8080: connect: connection refused" Apr 16 19:12:40.153964 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:40.153921 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.78:8080: connect: connection refused" Apr 16 19:12:50.154700 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:12:50.154657 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.78:8080: connect: connection refused" Apr 16 19:13:00.153995 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:00.153954 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.78:8080: connect: connection refused" Apr 16 19:13:10.154737 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:10.154693 2567 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.78:8080: connect: connection refused" Apr 16 19:13:20.155233 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:20.155200 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" Apr 16 19:13:25.893001 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:25.892922 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw"] Apr 16 19:13:25.893454 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:25.893267 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="kserve-container" containerID="cri-o://2939c9b2578cd3fa105550ea28b01c5ea98717f9eb86ad9fc784d514172acf92" gracePeriod=30 Apr 16 19:13:26.956384 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:26.956338 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t"] Apr 16 19:13:26.959959 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:26.959936 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t" Apr 16 19:13:26.970969 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:26.970945 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t"] Apr 16 19:13:27.115337 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:27.115296 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f924952-e6ef-4453-a06c-6fc17d5fcbe4-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t\" (UID: \"7f924952-e6ef-4453-a06c-6fc17d5fcbe4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t" Apr 16 19:13:27.216935 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:27.216839 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f924952-e6ef-4453-a06c-6fc17d5fcbe4-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t\" (UID: \"7f924952-e6ef-4453-a06c-6fc17d5fcbe4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t" Apr 16 19:13:27.217362 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:27.217337 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f924952-e6ef-4453-a06c-6fc17d5fcbe4-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t\" (UID: \"7f924952-e6ef-4453-a06c-6fc17d5fcbe4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t" Apr 16 19:13:27.273879 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:27.273839 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t" Apr 16 19:13:27.409554 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:27.409528 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t"] Apr 16 19:13:27.412766 ip-10-0-141-192 kubenswrapper[2567]: W0416 19:13:27.412738 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f924952_e6ef_4453_a06c_6fc17d5fcbe4.slice/crio-aaaafed883e7122020bb650b09cacdfd4c5acecfef6b42b6638dcaabb3f60148 WatchSource:0}: Error finding container aaaafed883e7122020bb650b09cacdfd4c5acecfef6b42b6638dcaabb3f60148: Status 404 returned error can't find the container with id aaaafed883e7122020bb650b09cacdfd4c5acecfef6b42b6638dcaabb3f60148 Apr 16 19:13:28.410247 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:28.410215 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t" event={"ID":"7f924952-e6ef-4453-a06c-6fc17d5fcbe4","Type":"ContainerStarted","Data":"f1c3081f08f58e4f1345f3cdb137d8335865026028131ad848ce2277c0250351"} Apr 16 19:13:28.410247 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:28.410249 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t" event={"ID":"7f924952-e6ef-4453-a06c-6fc17d5fcbe4","Type":"ContainerStarted","Data":"aaaafed883e7122020bb650b09cacdfd4c5acecfef6b42b6638dcaabb3f60148"} Apr 16 19:13:30.154847 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.154801 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.78:8080: connect: connection refused" Apr 16 19:13:30.239172 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.239149 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" Apr 16 19:13:30.242918 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.242889 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e12f73b9-2368-4d92-a36e-adc63facb7d8-kserve-provision-location\") pod \"e12f73b9-2368-4d92-a36e-adc63facb7d8\" (UID: \"e12f73b9-2368-4d92-a36e-adc63facb7d8\") " Apr 16 19:13:30.243028 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.242979 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e12f73b9-2368-4d92-a36e-adc63facb7d8-cabundle-cert\") pod \"e12f73b9-2368-4d92-a36e-adc63facb7d8\" (UID: \"e12f73b9-2368-4d92-a36e-adc63facb7d8\") " Apr 16 19:13:30.243247 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.243229 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e12f73b9-2368-4d92-a36e-adc63facb7d8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e12f73b9-2368-4d92-a36e-adc63facb7d8" (UID: "e12f73b9-2368-4d92-a36e-adc63facb7d8"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:13:30.243362 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.243339 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e12f73b9-2368-4d92-a36e-adc63facb7d8-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "e12f73b9-2368-4d92-a36e-adc63facb7d8" (UID: "e12f73b9-2368-4d92-a36e-adc63facb7d8"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:13:30.344257 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.344167 2567 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e12f73b9-2368-4d92-a36e-adc63facb7d8-cabundle-cert\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:13:30.344257 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.344198 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e12f73b9-2368-4d92-a36e-adc63facb7d8-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:13:30.419256 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.419227 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t_7f924952-e6ef-4453-a06c-6fc17d5fcbe4/storage-initializer/0.log" Apr 16 19:13:30.419412 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.419266 2567 generic.go:358] "Generic (PLEG): container finished" podID="7f924952-e6ef-4453-a06c-6fc17d5fcbe4" containerID="f1c3081f08f58e4f1345f3cdb137d8335865026028131ad848ce2277c0250351" exitCode=1 Apr 16 19:13:30.419412 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.419341 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t" event={"ID":"7f924952-e6ef-4453-a06c-6fc17d5fcbe4","Type":"ContainerDied","Data":"f1c3081f08f58e4f1345f3cdb137d8335865026028131ad848ce2277c0250351"} Apr 16 19:13:30.420893 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.420861 2567 generic.go:358] "Generic (PLEG): container finished" podID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerID="2939c9b2578cd3fa105550ea28b01c5ea98717f9eb86ad9fc784d514172acf92" exitCode=0 Apr 16 19:13:30.421007 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.420890 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" event={"ID":"e12f73b9-2368-4d92-a36e-adc63facb7d8","Type":"ContainerDied","Data":"2939c9b2578cd3fa105550ea28b01c5ea98717f9eb86ad9fc784d514172acf92"} Apr 16 19:13:30.421007 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.420927 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" event={"ID":"e12f73b9-2368-4d92-a36e-adc63facb7d8","Type":"ContainerDied","Data":"da0353d8b160b849a8ef52ad01a8a85c41b3d9df0b3cb3c2219b3a492167cf3a"} Apr 16 19:13:30.421007 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.420928 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw" Apr 16 19:13:30.421007 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.420941 2567 scope.go:117] "RemoveContainer" containerID="2939c9b2578cd3fa105550ea28b01c5ea98717f9eb86ad9fc784d514172acf92" Apr 16 19:13:30.429293 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.429188 2567 scope.go:117] "RemoveContainer" containerID="0943ee5d6303f65526f3e3848a3ee17a623504d53aefa0e3c8e1b0765ef5affc" Apr 16 19:13:30.437099 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.436961 2567 scope.go:117] "RemoveContainer" containerID="2939c9b2578cd3fa105550ea28b01c5ea98717f9eb86ad9fc784d514172acf92" Apr 16 19:13:30.437292 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:13:30.437275 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2939c9b2578cd3fa105550ea28b01c5ea98717f9eb86ad9fc784d514172acf92\": container with ID starting with 2939c9b2578cd3fa105550ea28b01c5ea98717f9eb86ad9fc784d514172acf92 not found: ID does not exist" containerID="2939c9b2578cd3fa105550ea28b01c5ea98717f9eb86ad9fc784d514172acf92" Apr 16 19:13:30.437341 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.437301 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2939c9b2578cd3fa105550ea28b01c5ea98717f9eb86ad9fc784d514172acf92"} err="failed to get container status \"2939c9b2578cd3fa105550ea28b01c5ea98717f9eb86ad9fc784d514172acf92\": rpc error: code = NotFound desc = could not find container \"2939c9b2578cd3fa105550ea28b01c5ea98717f9eb86ad9fc784d514172acf92\": container with ID starting with 2939c9b2578cd3fa105550ea28b01c5ea98717f9eb86ad9fc784d514172acf92 not found: ID does not exist" Apr 16 19:13:30.437341 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.437318 2567 scope.go:117] "RemoveContainer" containerID="0943ee5d6303f65526f3e3848a3ee17a623504d53aefa0e3c8e1b0765ef5affc" Apr 16 19:13:30.437608 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:13:30.437570 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0943ee5d6303f65526f3e3848a3ee17a623504d53aefa0e3c8e1b0765ef5affc\": container with ID starting with 0943ee5d6303f65526f3e3848a3ee17a623504d53aefa0e3c8e1b0765ef5affc not found: ID does not exist" containerID="0943ee5d6303f65526f3e3848a3ee17a623504d53aefa0e3c8e1b0765ef5affc" Apr 16 19:13:30.437680 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.437605 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0943ee5d6303f65526f3e3848a3ee17a623504d53aefa0e3c8e1b0765ef5affc"} err="failed to get container status \"0943ee5d6303f65526f3e3848a3ee17a623504d53aefa0e3c8e1b0765ef5affc\": rpc error: code = NotFound desc = could not find container \"0943ee5d6303f65526f3e3848a3ee17a623504d53aefa0e3c8e1b0765ef5affc\": container with ID starting with 0943ee5d6303f65526f3e3848a3ee17a623504d53aefa0e3c8e1b0765ef5affc not found: ID does not exist" Apr 16 19:13:30.446942 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.446921 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw"] Apr 16 19:13:30.450732 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:30.450710 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-768799b9b5-76dfw"] Apr 16 
19:13:31.241426 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:31.241395 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" path="/var/lib/kubelet/pods/e12f73b9-2368-4d92-a36e-adc63facb7d8/volumes" Apr 16 19:13:31.425622 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:31.425593 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t_7f924952-e6ef-4453-a06c-6fc17d5fcbe4/storage-initializer/0.log" Apr 16 19:13:31.425811 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:31.425719 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t" event={"ID":"7f924952-e6ef-4453-a06c-6fc17d5fcbe4","Type":"ContainerStarted","Data":"482fcf5d5f43cf07c672ae536d06acef188b2723ec36b4c3e77b75b53db0e99d"} Apr 16 19:13:36.966250 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:36.966215 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t"] Apr 16 19:13:36.966713 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:36.966459 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t" podUID="7f924952-e6ef-4453-a06c-6fc17d5fcbe4" containerName="storage-initializer" containerID="cri-o://482fcf5d5f43cf07c672ae536d06acef188b2723ec36b4c3e77b75b53db0e99d" gracePeriod=30 Apr 16 19:13:37.092938 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.092916 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t_7f924952-e6ef-4453-a06c-6fc17d5fcbe4/storage-initializer/1.log" Apr 16 19:13:37.093307 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.093291 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t_7f924952-e6ef-4453-a06c-6fc17d5fcbe4/storage-initializer/0.log" Apr 16 19:13:37.093378 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.093348 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t" Apr 16 19:13:37.196617 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.196590 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f924952-e6ef-4453-a06c-6fc17d5fcbe4-kserve-provision-location\") pod \"7f924952-e6ef-4453-a06c-6fc17d5fcbe4\" (UID: \"7f924952-e6ef-4453-a06c-6fc17d5fcbe4\") " Apr 16 19:13:37.196888 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.196866 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f924952-e6ef-4453-a06c-6fc17d5fcbe4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7f924952-e6ef-4453-a06c-6fc17d5fcbe4" (UID: "7f924952-e6ef-4453-a06c-6fc17d5fcbe4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:13:37.297882 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.297848 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f924952-e6ef-4453-a06c-6fc17d5fcbe4-kserve-provision-location\") on node \"ip-10-0-141-192.ec2.internal\" DevicePath \"\"" Apr 16 19:13:37.447728 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.447698 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t_7f924952-e6ef-4453-a06c-6fc17d5fcbe4/storage-initializer/1.log" Apr 16 19:13:37.448099 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.448081 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t_7f924952-e6ef-4453-a06c-6fc17d5fcbe4/storage-initializer/0.log" Apr 16 19:13:37.448163 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.448120 2567 generic.go:358] "Generic (PLEG): container finished" podID="7f924952-e6ef-4453-a06c-6fc17d5fcbe4" containerID="482fcf5d5f43cf07c672ae536d06acef188b2723ec36b4c3e77b75b53db0e99d" exitCode=1 Apr 16 19:13:37.448163 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.448149 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t" event={"ID":"7f924952-e6ef-4453-a06c-6fc17d5fcbe4","Type":"ContainerDied","Data":"482fcf5d5f43cf07c672ae536d06acef188b2723ec36b4c3e77b75b53db0e99d"} Apr 16 19:13:37.448232 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.448174 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t" event={"ID":"7f924952-e6ef-4453-a06c-6fc17d5fcbe4","Type":"ContainerDied","Data":"aaaafed883e7122020bb650b09cacdfd4c5acecfef6b42b6638dcaabb3f60148"} Apr 16 19:13:37.448232 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.448189 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t" Apr 16 19:13:37.448296 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.448191 2567 scope.go:117] "RemoveContainer" containerID="482fcf5d5f43cf07c672ae536d06acef188b2723ec36b4c3e77b75b53db0e99d" Apr 16 19:13:37.456033 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.456003 2567 scope.go:117] "RemoveContainer" containerID="f1c3081f08f58e4f1345f3cdb137d8335865026028131ad848ce2277c0250351" Apr 16 19:13:37.462756 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.462737 2567 scope.go:117] "RemoveContainer" containerID="482fcf5d5f43cf07c672ae536d06acef188b2723ec36b4c3e77b75b53db0e99d" Apr 16 19:13:37.463005 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:13:37.462984 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"482fcf5d5f43cf07c672ae536d06acef188b2723ec36b4c3e77b75b53db0e99d\": container with ID starting with 482fcf5d5f43cf07c672ae536d06acef188b2723ec36b4c3e77b75b53db0e99d not found: ID does not exist" containerID="482fcf5d5f43cf07c672ae536d06acef188b2723ec36b4c3e77b75b53db0e99d" Apr 16 19:13:37.463110 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.463019 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"482fcf5d5f43cf07c672ae536d06acef188b2723ec36b4c3e77b75b53db0e99d"} err="failed to get container status \"482fcf5d5f43cf07c672ae536d06acef188b2723ec36b4c3e77b75b53db0e99d\": rpc error: code = NotFound desc = could not find container \"482fcf5d5f43cf07c672ae536d06acef188b2723ec36b4c3e77b75b53db0e99d\": container with ID starting with 482fcf5d5f43cf07c672ae536d06acef188b2723ec36b4c3e77b75b53db0e99d not found: ID does not exist" Apr 16 19:13:37.463110 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.463063 2567 scope.go:117] "RemoveContainer" containerID="f1c3081f08f58e4f1345f3cdb137d8335865026028131ad848ce2277c0250351" Apr 16 19:13:37.463314 ip-10-0-141-192 kubenswrapper[2567]: E0416 19:13:37.463276 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1c3081f08f58e4f1345f3cdb137d8335865026028131ad848ce2277c0250351\": container with ID starting with f1c3081f08f58e4f1345f3cdb137d8335865026028131ad848ce2277c0250351 not found: ID does not exist" containerID="f1c3081f08f58e4f1345f3cdb137d8335865026028131ad848ce2277c0250351" Apr 16 19:13:37.463353 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.463320 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c3081f08f58e4f1345f3cdb137d8335865026028131ad848ce2277c0250351"} err="failed to get container status \"f1c3081f08f58e4f1345f3cdb137d8335865026028131ad848ce2277c0250351\": rpc error: code = NotFound desc = could not find container \"f1c3081f08f58e4f1345f3cdb137d8335865026028131ad848ce2277c0250351\": container with ID starting with f1c3081f08f58e4f1345f3cdb137d8335865026028131ad848ce2277c0250351 not found: ID does not exist" Apr 16 19:13:37.476518 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.476496 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t"] Apr 16 19:13:37.479994 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:37.479974 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b4c6f7c79-8mf8t"] Apr 16 
19:13:39.242705 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:13:39.242667 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f924952-e6ef-4453-a06c-6fc17d5fcbe4" path="/var/lib/kubelet/pods/7f924952-e6ef-4453-a06c-6fc17d5fcbe4/volumes" Apr 16 19:14:07.222435 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:07.222406 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xx8pm_3544bf39-6ebd-4771-b6fb-1c8d17fcabdb/global-pull-secret-syncer/0.log" Apr 16 19:14:07.271161 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:07.271128 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-bdf4x_a423aaa2-c387-4ccb-a9ca-a627b634154d/konnectivity-agent/0.log" Apr 16 19:14:07.387000 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:07.386975 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-192.ec2.internal_3b61c38e01c9c0b2139dda9e09bff1f5/haproxy/0.log" Apr 16 19:14:11.078111 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:11.078076 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-q2xwd_a485b132-000d-45ba-814d-0a8ae8aa60b1/kube-state-metrics/0.log" Apr 16 19:14:11.101517 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:11.101486 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-q2xwd_a485b132-000d-45ba-814d-0a8ae8aa60b1/kube-rbac-proxy-main/0.log" Apr 16 19:14:11.126031 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:11.126004 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-q2xwd_a485b132-000d-45ba-814d-0a8ae8aa60b1/kube-rbac-proxy-self/0.log" Apr 16 19:14:11.161075 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:11.161033 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7c8979ff68-w29nx_3cd4878e-7041-4bf3-97b2-ff7307a91899/metrics-server/0.log" Apr 16 19:14:11.363679 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:11.363602 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hh7nj_c788c166-fe22-4cdc-919e-d0d5a8ac872f/node-exporter/0.log" Apr 16 19:14:11.385861 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:11.385836 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hh7nj_c788c166-fe22-4cdc-919e-d0d5a8ac872f/kube-rbac-proxy/0.log" Apr 16 19:14:11.409427 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:11.409403 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hh7nj_c788c166-fe22-4cdc-919e-d0d5a8ac872f/init-textfile/0.log" Apr 16 19:14:11.437161 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:11.437131 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-s8k28_15bb2e52-9699-4655-905c-281a6c94f097/kube-rbac-proxy-main/0.log" Apr 16 19:14:11.460876 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:11.460846 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-s8k28_15bb2e52-9699-4655-905c-281a6c94f097/kube-rbac-proxy-self/0.log" Apr 16 19:14:11.482920 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:11.482894 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-s8k28_15bb2e52-9699-4655-905c-281a6c94f097/openshift-state-metrics/0.log" Apr 16 19:14:13.096703 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:13.096658 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-6fb2m_64f89b21-4ff4-4616-8331-df624059595f/networking-console-plugin/0.log" Apr 16 19:14:13.832697 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:13.832671 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-688684b878-lqnzv_41f0d5a2-a14e-43d3-92d4-f5a2eac53408/console/0.log" Apr 16 19:14:13.866982 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:13.866954 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-b7ts9_7a8a2626-153f-4555-84e3-6c7d86f5db58/download-server/0.log" Apr 16 19:14:14.085074 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.084972 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw"] Apr 16 19:14:14.085387 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.085372 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f924952-e6ef-4453-a06c-6fc17d5fcbe4" containerName="storage-initializer" Apr 16 19:14:14.085434 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.085389 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f924952-e6ef-4453-a06c-6fc17d5fcbe4" containerName="storage-initializer" Apr 16 19:14:14.085434 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.085405 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="storage-initializer" Apr 16 19:14:14.085434 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.085410 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="storage-initializer" Apr 16 19:14:14.085434 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.085421 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="kserve-container" Apr 16 19:14:14.085434 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.085426 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="kserve-container" Apr 16 19:14:14.085627 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.085483 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e12f73b9-2368-4d92-a36e-adc63facb7d8" containerName="kserve-container" Apr 16 19:14:14.085627 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.085493 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f924952-e6ef-4453-a06c-6fc17d5fcbe4" containerName="storage-initializer" Apr 16 19:14:14.085627 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.085499 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f924952-e6ef-4453-a06c-6fc17d5fcbe4" containerName="storage-initializer" Apr 16 19:14:14.088603 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.088583 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.091495 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.091476 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p86h9\"/\"kube-root-ca.crt\"" Apr 16 19:14:14.092492 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.092469 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-p86h9\"/\"default-dockercfg-j49c7\"" Apr 16 19:14:14.092492 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.092486 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p86h9\"/\"openshift-service-ca.crt\"" Apr 16 19:14:14.100417 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.100393 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw"] Apr 16 19:14:14.217865 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.217828 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7qp5\" (UniqueName: \"kubernetes.io/projected/04d154f6-0c2d-40a4-bf95-6115da4369ff-kube-api-access-r7qp5\") pod \"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.218038 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.217891 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04d154f6-0c2d-40a4-bf95-6115da4369ff-lib-modules\") pod \"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.218038 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.217913 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/04d154f6-0c2d-40a4-bf95-6115da4369ff-podres\") pod \"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.218038 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.217940 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/04d154f6-0c2d-40a4-bf95-6115da4369ff-proc\") pod \"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.218038 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.218007 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04d154f6-0c2d-40a4-bf95-6115da4369ff-sys\") pod \"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.268643 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.268615 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-6sggh_ddccaae8-9c6a-4a8c-8840-86e365b52d47/volume-data-source-validator/0.log" Apr 16 19:14:14.319578 ip-10-0-141-192 kubenswrapper[2567]: I0416 
19:14:14.319533 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04d154f6-0c2d-40a4-bf95-6115da4369ff-lib-modules\") pod \"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.319785 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.319600 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/04d154f6-0c2d-40a4-bf95-6115da4369ff-podres\") pod \"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.319785 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.319697 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/04d154f6-0c2d-40a4-bf95-6115da4369ff-proc\") pod \"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.319785 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.319712 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04d154f6-0c2d-40a4-bf95-6115da4369ff-lib-modules\") pod \"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.319955 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.319783 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04d154f6-0c2d-40a4-bf95-6115da4369ff-sys\") pod \"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.319955 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.319866 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/04d154f6-0c2d-40a4-bf95-6115da4369ff-podres\") pod \"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.319955 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.319882 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/04d154f6-0c2d-40a4-bf95-6115da4369ff-proc\") pod \"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.319955 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.319891 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7qp5\" (UniqueName: \"kubernetes.io/projected/04d154f6-0c2d-40a4-bf95-6115da4369ff-kube-api-access-r7qp5\") pod \"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.319955 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.319938 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04d154f6-0c2d-40a4-bf95-6115da4369ff-sys\") pod 
\"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.329759 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.329731 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7qp5\" (UniqueName: \"kubernetes.io/projected/04d154f6-0c2d-40a4-bf95-6115da4369ff-kube-api-access-r7qp5\") pod \"perf-node-gather-daemonset-27brw\" (UID: \"04d154f6-0c2d-40a4-bf95-6115da4369ff\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.398459 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.398378 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:14.723699 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:14.723674 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw"] Apr 16 19:14:14.725701 ip-10-0-141-192 kubenswrapper[2567]: W0416 19:14:14.725669 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod04d154f6_0c2d_40a4_bf95_6115da4369ff.slice/crio-0e7146e103232747837d0dd83c6827ca27c2eab637687661317a3cba58ffd620 WatchSource:0}: Error finding container 0e7146e103232747837d0dd83c6827ca27c2eab637687661317a3cba58ffd620: Status 404 returned error can't find the container with id 0e7146e103232747837d0dd83c6827ca27c2eab637687661317a3cba58ffd620 Apr 16 19:14:15.022601 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:15.022529 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wcq4f_6b2f5246-4971-4502-a370-da270e3fd3dc/dns/0.log" Apr 16 19:14:15.041919 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:15.041894 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wcq4f_6b2f5246-4971-4502-a370-da270e3fd3dc/kube-rbac-proxy/0.log" Apr 16 19:14:15.063955 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:15.063928 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9pxxv_9a432113-3c33-4a2e-970d-baa3beba7cc7/dns-node-resolver/0.log" Apr 16 19:14:15.507251 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:15.507223 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6f5899868d-w82sl_6b8d63a6-6b9b-479e-abd2-e27c25f678c7/registry/0.log" Apr 16 19:14:15.547806 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:15.547779 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5zgqq_797c1b6c-d66b-4ed4-9555-07a34d9d2f2a/node-ca/0.log" Apr 16 19:14:15.575751 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:15.575720 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" event={"ID":"04d154f6-0c2d-40a4-bf95-6115da4369ff","Type":"ContainerStarted","Data":"82031c9080ac3750f2e6d246172fb26919fff24b5c43975eacc54db72289e26b"} Apr 16 19:14:15.575751 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:15.575753 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" event={"ID":"04d154f6-0c2d-40a4-bf95-6115da4369ff","Type":"ContainerStarted","Data":"0e7146e103232747837d0dd83c6827ca27c2eab637687661317a3cba58ffd620"} Apr 16 19:14:15.575975 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:15.575782 2567 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:15.593199 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:15.593144 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" podStartSLOduration=1.593130066 podStartE2EDuration="1.593130066s" podCreationTimestamp="2026-04-16 19:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:14:15.59149319 +0000 UTC m=+3816.975156672" watchObservedRunningTime="2026-04-16 19:14:15.593130066 +0000 UTC m=+3816.976793577" Apr 16 19:14:16.219053 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:16.219022 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5c668c7d56-vrfwm_6347d10a-8636-4345-ac41-33e915aa23d9/router/0.log" Apr 16 19:14:16.557738 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:16.557665 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dmjvg_199e8adf-4b79-4ccd-a556-c3d828499a76/serve-healthcheck-canary/0.log" Apr 16 19:14:16.923606 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:16.923533 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2z8gk_d15436f7-1a2d-4358-bca3-f550e2f4f3ed/kube-rbac-proxy/0.log" Apr 16 19:14:16.943710 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:16.943683 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2z8gk_d15436f7-1a2d-4358-bca3-f550e2f4f3ed/exporter/0.log" Apr 16 19:14:16.965592 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:16.965569 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2z8gk_d15436f7-1a2d-4358-bca3-f550e2f4f3ed/extractor/0.log" Apr 16 19:14:19.100555 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:19.100522 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-65589c6846-m7fxb_ad965b4d-28c7-48d3-a278-b1f64b8c8284/manager/0.log" Apr 16 19:14:19.140572 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:19.140546 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-97659_e4980a96-da9b-4070-92aa-f39af7bedab9/server/0.log" Apr 16 19:14:19.352663 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:19.352560 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-gmrrc_b6487db1-ad50-4523-844f-e7d632a41af0/manager/0.log" Apr 16 19:14:19.493102 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:19.493070 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-sf72x_6ddb2428-0e30-46e2-a7cb-52b3ad1ee367/seaweedfs/0.log" Apr 16 19:14:19.514368 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:19.514345 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-wcb2t_bb096826-14ff-440f-837d-056444717b80/seaweedfs-tls-custom/0.log" Apr 16 19:14:21.587956 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:21.587928 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-27brw" Apr 16 19:14:23.608636 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:23.608598 2567 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-6xrcn_2c4b6825-b8eb-410a-94cd-b638a19c37c4/kube-storage-version-migrator-operator/1.log" Apr 16 19:14:23.610301 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:23.610276 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-6xrcn_2c4b6825-b8eb-410a-94cd-b638a19c37c4/kube-storage-version-migrator-operator/0.log" Apr 16 19:14:24.857431 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:24.857404 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mq5wk_0189e11b-ff99-45bd-a5b2-5f0873332309/kube-multus-additional-cni-plugins/0.log" Apr 16 19:14:24.878822 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:24.878792 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mq5wk_0189e11b-ff99-45bd-a5b2-5f0873332309/egress-router-binary-copy/0.log" Apr 16 19:14:24.899097 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:24.899072 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mq5wk_0189e11b-ff99-45bd-a5b2-5f0873332309/cni-plugins/0.log" Apr 16 19:14:24.925310 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:24.925286 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mq5wk_0189e11b-ff99-45bd-a5b2-5f0873332309/bond-cni-plugin/0.log" Apr 16 19:14:24.945065 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:24.945020 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mq5wk_0189e11b-ff99-45bd-a5b2-5f0873332309/routeoverride-cni/0.log" Apr 16 19:14:24.971233 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:24.971213 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mq5wk_0189e11b-ff99-45bd-a5b2-5f0873332309/whereabouts-cni-bincopy/0.log" Apr 16 19:14:24.995708 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:24.995687 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mq5wk_0189e11b-ff99-45bd-a5b2-5f0873332309/whereabouts-cni/0.log" Apr 16 19:14:25.054001 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:25.053974 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nth7f_a8a6a8c0-5642-4be5-90d3-2827312267c3/kube-multus/0.log" Apr 16 19:14:25.150460 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:25.150368 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8w622_059c23ce-c3ea-4b83-a8a4-3c537435306e/network-metrics-daemon/0.log" Apr 16 19:14:25.171077 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:25.171026 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8w622_059c23ce-c3ea-4b83-a8a4-3c537435306e/kube-rbac-proxy/0.log" Apr 16 19:14:26.223887 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:26.223849 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk24r_b488ff1d-ff75-45f2-8473-6f48445e1b55/ovn-controller/0.log" Apr 16 19:14:26.258017 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:26.257988 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk24r_b488ff1d-ff75-45f2-8473-6f48445e1b55/ovn-acl-logging/0.log" Apr 16 19:14:26.275760 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:26.275729 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk24r_b488ff1d-ff75-45f2-8473-6f48445e1b55/kube-rbac-proxy-node/0.log" Apr 16 19:14:26.294739 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:26.294715 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk24r_b488ff1d-ff75-45f2-8473-6f48445e1b55/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 19:14:26.312300 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:26.312280 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk24r_b488ff1d-ff75-45f2-8473-6f48445e1b55/northd/0.log" Apr 16 19:14:26.330804 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:26.330780 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk24r_b488ff1d-ff75-45f2-8473-6f48445e1b55/nbdb/0.log" Apr 16 19:14:26.352451 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:26.352432 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk24r_b488ff1d-ff75-45f2-8473-6f48445e1b55/sbdb/0.log" Apr 16 19:14:26.460810 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:26.460786 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fk24r_b488ff1d-ff75-45f2-8473-6f48445e1b55/ovnkube-controller/0.log" Apr 16 19:14:27.735554 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:27.735526 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9tn4q_30b4fba5-7d19-4433-b373-76fe14544828/network-check-target-container/0.log" Apr 16 19:14:28.641373 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:28.641344 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-l8nt5_185c47d9-cd61-422a-b3f0-0a6dd4148756/iptables-alerter/0.log" Apr 16 19:14:29.245545 ip-10-0-141-192 kubenswrapper[2567]: I0416 19:14:29.245509 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-2p2bw_7da2a734-1f23-41a0-a455-b5e1b7871e27/tuned/0.log"