Apr 22 19:23:47.223434 ip-10-0-140-242 systemd[1]: Starting Kubernetes Kubelet... Apr 22 19:23:47.716638 ip-10-0-140-242 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 22 19:23:47.716638 ip-10-0-140-242 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 22 19:23:47.716638 ip-10-0-140-242 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 22 19:23:47.716638 ip-10-0-140-242 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 22 19:23:47.716638 ip-10-0-140-242 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 22 19:23:47.718297 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.718187 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 22 19:23:47.723791 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723632 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:47.723791 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723793 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723798 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723801 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723804 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723807 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723810 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723813 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723816 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723819 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723822 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723824 2569 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723827 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723830 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723833 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723836 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723844 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723847 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723849 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723852 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723855 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:47.723868 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723857 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723862 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723865 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723868 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723871 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723875 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723878 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723882 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723885 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723887 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723890 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723893 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723897 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723899 2569 
feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723902 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723905 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723907 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723910 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723912 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723915 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723918 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:47.724497 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723921 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723923 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723926 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723929 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723931 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723936 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723940 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723944 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723947 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723949 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723952 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723954 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723957 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723960 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723963 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723968 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723971 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723974 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723977 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723979 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:47.725017 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723982 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723985 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723987 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723990 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723993 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.723998 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724001 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724005 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724008 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724011 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724014 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724016 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724019 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724022 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724025 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724027 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724030 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724033 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724035 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:47.725526 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724040 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724043 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724046 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724048 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724051 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724492 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724499 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724504 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724508 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724512 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724515 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724518 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724521 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724524 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724527 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724530 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724533 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724536 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724538 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:47.725990 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724541 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724544 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724546 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724549 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724552 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724554 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724557 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724560 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724562 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724565 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724567 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 
19:23:47.724570 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724573 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724576 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724579 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724581 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724584 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724587 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724589 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724592 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:47.726479 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724595 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724598 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724600 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724603 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724606 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724609 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724612 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724615 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724618 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724620 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724623 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724625 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724628 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724631 
2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724633 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724635 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724639 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724641 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724643 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724646 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:47.727005 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724649 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724651 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724654 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724656 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724659 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724661 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724664 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724667 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724669 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724672 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724674 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724678 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724681 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724684 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724686 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724689 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 
19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724692 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724695 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724698 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724700 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:47.727516 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724702 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724706 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724710 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724713 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724716 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724718 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724721 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724724 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724726 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724729 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724731 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.724734 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724812 2569 flags.go:64] FLAG: --address="0.0.0.0" Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724819 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724827 2569 flags.go:64] FLAG: --anonymous-auth="true" Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724832 2569 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724836 2569 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724840 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724844 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: 
I0422 19:23:47.724849 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724853 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 22 19:23:47.728020 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724857 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724860 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724864 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724868 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724871 2569 flags.go:64] FLAG: --cgroup-root="" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724874 2569 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724876 2569 flags.go:64] FLAG: --client-ca-file="" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724880 2569 flags.go:64] FLAG: --cloud-config="" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724882 2569 flags.go:64] FLAG: --cloud-provider="external" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724885 2569 flags.go:64] FLAG: --cluster-dns="[]" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724890 2569 flags.go:64] FLAG: --cluster-domain="" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724894 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724897 2569 flags.go:64] FLAG: --config-dir="" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724900 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724903 2569 flags.go:64] FLAG: --container-log-max-files="5" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724907 2569 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724911 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724914 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724917 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724920 2569 flags.go:64] FLAG: --contention-profiling="false" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724923 2569 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724926 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724929 2569 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724933 2569 flags.go:64] FLAG: 
--cpu-manager-policy-options="" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724937 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 22 19:23:47.728547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724940 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724944 2569 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724946 2569 flags.go:64] FLAG: --enable-load-reader="false" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724949 2569 flags.go:64] FLAG: --enable-server="true" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724953 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724958 2569 flags.go:64] FLAG: --event-burst="100" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724961 2569 flags.go:64] FLAG: --event-qps="50" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724965 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724968 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724971 2569 flags.go:64] FLAG: --eviction-hard="" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724975 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724978 2569 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724982 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724985 2569 flags.go:64] FLAG: --eviction-soft="" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724988 2569 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724991 2569 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724994 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.724997 2569 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725000 2569 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725003 2569 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725006 2569 flags.go:64] FLAG: --feature-gates="" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725010 2569 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725013 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725016 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725020 2569 
flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725023 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:23:47.729170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725026 2569 flags.go:64] FLAG: --help="false" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725029 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-140-242.ec2.internal" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725033 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725036 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725039 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725042 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725045 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725049 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725052 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725054 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725058 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725061 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725064 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725068 2569 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725071 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725074 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725077 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725080 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725083 2569 flags.go:64] FLAG: --lock-file="" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725086 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725089 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725109 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725116 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725119 2569 
flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:23:47.729786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725122 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725125 2569 flags.go:64] FLAG: --logging-format="text" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725128 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725131 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725135 2569 flags.go:64] FLAG: --manifest-url="" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725138 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725143 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725147 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725151 2569 flags.go:64] FLAG: --max-pods="110" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725155 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725158 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725161 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725164 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725167 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725170 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725173 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725181 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725184 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725187 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725190 2569 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725193 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725198 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725201 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725205 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:23:47.730375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725208 2569 
flags.go:64] FLAG: --port="10250" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725211 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725214 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07066b3970ad2f212" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725217 2569 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725220 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725223 2569 flags.go:64] FLAG: --register-node="true" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725227 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725230 2569 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725234 2569 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725237 2569 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725239 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725242 2569 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725246 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725249 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725252 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725255 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725258 2569 flags.go:64] FLAG: --runonce="false" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725261 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725265 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725268 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725271 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725274 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725278 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725281 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725284 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725286 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:23:47.730953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725289 2569 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725292 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725295 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725298 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725301 2569 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725304 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725310 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725313 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725316 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725320 2569 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725323 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725326 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725329 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725332 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725342 2569 flags.go:64] FLAG: --v="2" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725346 2569 flags.go:64] FLAG: --version="false" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725350 2569 flags.go:64] FLAG: --vmodule="" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725355 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.725358 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725488 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725493 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725497 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725501 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725504 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:47.731585 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725508 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725511 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725514 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725516 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725520 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725524 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725527 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725530 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725533 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725536 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725539 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725542 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725545 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725548 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725551 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725555 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725558 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725561 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725563 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 
19:23:47.725566 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:47.732245 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725569 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725572 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725575 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725578 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725580 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725583 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725585 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725588 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725591 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725593 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725596 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725598 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725602 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725605 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725608 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725610 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725613 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725615 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725618 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:47.732760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725621 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725623 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 
19:23:47.725626 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725629 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725631 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725634 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725637 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725640 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725642 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725647 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725649 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725652 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725655 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725657 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725660 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725664 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725666 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725669 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725671 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725674 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:47.733488 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725677 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725679 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725682 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725684 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725687 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot 
Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725691 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725693 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725696 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725699 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725701 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725704 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725707 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725710 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725712 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725715 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725718 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725720 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725723 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725726 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725729 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:47.734211 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725731 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:47.734711 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.725734 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:47.734711 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.726959 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:23:47.735659 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.735638 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 19:23:47.735697 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.735662 2569 server.go:532] "Golang settings" GOGC="" 
GOMAXPROCS="" GOTRACEBACK="" Apr 22 19:23:47.735727 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735715 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:47.735727 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735721 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:47.735727 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735725 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:47.735727 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735728 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735731 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735734 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735737 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735739 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735742 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735745 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735749 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735754 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735757 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735761 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735764 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735767 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735770 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735774 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735776 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735779 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735782 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735785 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735787 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:47.735832 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735790 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735793 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735795 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735798 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735801 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735803 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735806 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735808 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735812 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735814 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735817 2569 
feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735820 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735822 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735825 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735829 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735831 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735834 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735838 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735841 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735844 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:47.736342 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735846 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735849 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735852 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735855 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735857 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735860 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735863 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735866 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735869 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735871 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735874 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735876 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735879 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:47.736825 ip-10-0-140-242 
kubenswrapper[2569]: W0422 19:23:47.735882 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735884 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735887 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735890 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735892 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735895 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735898 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:47.736825 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735901 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735903 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735906 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735908 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735911 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735914 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735917 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735919 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735922 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735924 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735928 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735933 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735936 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735939 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735942 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735944 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735947 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735950 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735952 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:47.737367 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735955 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735958 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735961 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.735963 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.735969 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736088 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736108 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736111 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736114 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736117 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736120 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736123 2569 
feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736126 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736128 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736132 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736134 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:47.737830 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736137 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736140 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736142 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736146 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736149 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736152 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736154 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736157 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736160 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736163 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736165 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736169 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736172 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736174 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736177 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736179 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736182 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736184 2569 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736187 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:47.738240 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736189 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736193 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736196 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736199 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736201 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736204 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736206 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736209 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736211 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736214 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736216 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736219 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736222 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736224 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736227 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736229 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736232 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736235 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736238 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736240 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:47.738726 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736243 2569 feature_gate.go:328] unrecognized feature 
gate: CPMSMachineNamePrefix Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736245 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736248 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736251 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736253 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736256 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736259 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736261 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736264 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736267 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736269 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736272 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736274 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736277 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736279 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736282 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736285 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736287 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736290 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736293 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:47.739331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736295 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736298 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736300 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736303 2569 
feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736307 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736311 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736314 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736316 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736320 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736324 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736328 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736331 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736334 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736337 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736339 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:47.739828 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:47.736342 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:47.740221 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.736347 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:23:47.740221 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.737212 2569 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 19:23:47.740221 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.740006 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 19:23:47.741148 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.741135 2569 server.go:1019] "Starting client certificate rotation" Apr 22 19:23:47.741263 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.741243 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:23:47.741302 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.741292 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:23:47.769910 
ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.769885 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:23:47.774821 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.774799 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:23:47.788470 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.788444 2569 log.go:25] "Validated CRI v1 runtime API" Apr 22 19:23:47.794177 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.794157 2569 log.go:25] "Validated CRI v1 image API" Apr 22 19:23:47.795386 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.795358 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 19:23:47.799970 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.799914 2569 fs.go:135] Filesystem UUIDs: map[17a112c5-7821-4608-919c-c13a7f64ff15:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9ab0c0e5-6bb7-4281-83c0-c905c6bdb3b9:/dev/nvme0n1p4] Apr 22 19:23:47.800077 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.799969 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 19:23:47.802441 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.802417 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:23:47.807047 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.806930 2569 manager.go:217] Machine: {Timestamp:2026-04-22 19:23:47.804670059 +0000 UTC m=+0.452684949 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3095107 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ec3f69263068d71b4f24ab3e0ec73 SystemUUID:ec2ec3f6-9263-068d-71b4-f24ab3e0ec73 BootID:8fc623a1-2521-44c1-b723-c45e678cf5a4 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b9:a7:62:6a:7d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b9:a7:62:6a:7d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0e:c9:2d:44:36:a6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} 
{PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 19:23:47.807503 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.807491 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 22 19:23:47.807654 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.807641 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 19:23:47.808721 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.808696 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 19:23:47.808897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.808723 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-242.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:23:47.808940 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.808906 2569 topology_manager.go:138] "Creating topology 
manager with none policy" Apr 22 19:23:47.808940 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.808915 2569 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:23:47.808940 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.808928 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:23:47.809862 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.809850 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:23:47.811232 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.811221 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:23:47.811341 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.811332 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:23:47.813809 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.813799 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:23:47.813841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.813813 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:23:47.813841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.813825 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:23:47.813841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.813835 2569 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:23:47.813928 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.813843 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 19:23:47.815301 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.815289 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:23:47.815342 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.815308 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:23:47.818911 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.818895 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:23:47.820738 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.820725 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:23:47.822176 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.822162 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:23:47.822230 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.822180 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:23:47.822230 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.822187 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:23:47.822230 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.822194 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:23:47.822230 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.822199 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:23:47.822230 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.822205 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:23:47.822230 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.822211 2569 plugins.go:616] "Loaded 
volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 19:23:47.822230 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.822217 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:23:47.822230 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.822225 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:23:47.822230 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.822232 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:23:47.822499 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.822254 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:23:47.822499 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.822264 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:23:47.823819 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.823809 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:23:47.823859 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.823821 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:23:47.827317 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.827304 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:23:47.827375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.827341 2569 server.go:1295] "Started kubelet" Apr 22 19:23:47.827492 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.827446 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:23:47.827544 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.827449 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:23:47.827593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.827545 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:23:47.828277 ip-10-0-140-242 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 19:23:47.829334 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.829312 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:23:47.829979 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.829960 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:23:47.830988 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.830966 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pdc8s" Apr 22 19:23:47.832627 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.832592 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-242.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:23:47.832700 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:47.832664 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:23:47.832700 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:47.832661 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-242.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:23:47.834416 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.834393 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:23:47.834936 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.834913 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:23:47.835756 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.835734 2569 factory.go:55] Registering systemd factory Apr 22 19:23:47.835833 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.835815 2569 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:23:47.835897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.835883 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:23:47.835983 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.835883 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:23:47.836060 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.835994 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:23:47.836126 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.836116 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:23:47.836182 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.836127 2569 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:23:47.836182 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.836160 2569 factory.go:153] Registering CRI-O factory Apr 22 19:23:47.836182 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.836174 2569 factory.go:223] Registration of the crio container factory successfully Apr 22 19:23:47.836303 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.836224 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: 
containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:23:47.836303 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.836252 2569 factory.go:103] Registering Raw factory Apr 22 19:23:47.836303 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.836268 2569 manager.go:1196] Started watching for new ooms in manager Apr 22 19:23:47.836432 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:47.836346 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-242.ec2.internal\" not found" Apr 22 19:23:47.836758 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.836742 2569 manager.go:319] Starting recovery of all containers Apr 22 19:23:47.837922 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:47.837866 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 19:23:47.838256 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:47.838228 2569 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-242.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 19:23:47.838570 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:47.838540 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 19:23:47.839873 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:47.838596 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-242.ec2.internal.18a8c43efe12bea4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-242.ec2.internal,UID:ip-10-0-140-242.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-242.ec2.internal,},FirstTimestamp:2026-04-22 19:23:47.827318436 +0000 UTC m=+0.475333234,LastTimestamp:2026-04-22 19:23:47.827318436 +0000 UTC m=+0.475333234,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-242.ec2.internal,}" Apr 22 19:23:47.843588 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.843563 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pdc8s" Apr 22 19:23:47.848860 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.848835 2569 manager.go:324] Recovery completed Apr 22 19:23:47.849011 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.848982 2569 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 19:23:47.854362 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.854348 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:47.857348 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.857333 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:47.857414 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.857363 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:47.857414 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.857376 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:47.857884 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.857871 2569 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:23:47.857884 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.857883 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:23:47.857988 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.857899 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:23:47.859883 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.859870 2569 policy_none.go:49] "None policy: Start" Apr 22 19:23:47.859939 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.859889 2569 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:23:47.859939 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.859900 2569 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:23:47.862769 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:47.862693 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-242.ec2.internal.18a8c43effdcf172 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-242.ec2.internal,UID:ip-10-0-140-242.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-242.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-242.ec2.internal,},FirstTimestamp:2026-04-22 19:23:47.85734693 +0000 UTC m=+0.505361725,LastTimestamp:2026-04-22 19:23:47.85734693 +0000 UTC m=+0.505361725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-242.ec2.internal,}" Apr 22 19:23:47.892909 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.892887 2569 manager.go:341] "Starting Device Plugin manager" Apr 22 19:23:47.907637 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:47.892968 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:23:47.907637 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.892983 2569 server.go:85] "Starting device plugin registration server" Apr 22 19:23:47.907637 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.893286 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:23:47.907637 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.893298 2569 container_log_manager.go:189] "Initializing container 
log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:23:47.907637 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.893376 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:23:47.907637 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.893461 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:23:47.907637 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.893474 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:23:47.907637 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:47.894375 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:23:47.907637 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:47.894414 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-242.ec2.internal\" not found" Apr 22 19:23:47.946313 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.946288 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:23:47.946434 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.946321 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:23:47.946434 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.946340 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 19:23:47.946434 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.946348 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:23:47.946434 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:47.946424 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:23:47.953791 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.953766 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:47.994065 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.993993 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:47.995054 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.995035 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:47.995178 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.995080 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:47.995178 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.995114 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:47.995178 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:47.995157 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.004419 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.004394 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.004502 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.004422 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-242.ec2.internal\": node \"ip-10-0-140-242.ec2.internal\" not found" Apr 22 
19:23:48.021654 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.021620 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-242.ec2.internal\" not found" Apr 22 19:23:48.046549 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.046517 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-242.ec2.internal"] Apr 22 19:23:48.046647 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.046608 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:48.047594 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.047575 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:48.047702 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.047612 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:48.047702 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.047630 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:48.048886 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.048870 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:48.049088 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.049072 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.049160 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.049120 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:48.049624 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.049608 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:48.049686 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.049650 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:48.049686 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.049658 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:48.049686 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.049680 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:48.049686 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.049684 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:48.049829 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.049699 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:48.050724 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.050708 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.050823 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.050743 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:48.051430 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.051403 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:48.051430 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.051430 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:48.051560 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.051444 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:48.084915 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.084891 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-242.ec2.internal\" not found" node="ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.089438 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.089420 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-242.ec2.internal\" not found" node="ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.121907 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.121878 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-242.ec2.internal\" not found" Apr 22 19:23:48.137494 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.137466 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a6923805a814b270020f7b819e6da6c2-config\") pod \"kube-apiserver-proxy-ip-10-0-140-242.ec2.internal\" (UID: \"a6923805a814b270020f7b819e6da6c2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.222891 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.222858 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-242.ec2.internal\" not found" Apr 22 19:23:48.239599 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.239572 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a6923805a814b270020f7b819e6da6c2-config\") pod \"kube-apiserver-proxy-ip-10-0-140-242.ec2.internal\" (UID: \"a6923805a814b270020f7b819e6da6c2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.239652 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.239608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ec5e763de14cf3a8a316ab3ccb4124b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal\" (UID: \"7ec5e763de14cf3a8a316ab3ccb4124b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.239652 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.239628 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ec5e763de14cf3a8a316ab3ccb4124b-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal\" (UID: \"7ec5e763de14cf3a8a316ab3ccb4124b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.239716 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.239677 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a6923805a814b270020f7b819e6da6c2-config\") pod \"kube-apiserver-proxy-ip-10-0-140-242.ec2.internal\" (UID: \"a6923805a814b270020f7b819e6da6c2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.323733 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.323659 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-242.ec2.internal\" not found" Apr 22 19:23:48.340132 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.340088 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ec5e763de14cf3a8a316ab3ccb4124b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal\" (UID: \"7ec5e763de14cf3a8a316ab3ccb4124b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.340185 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.340137 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ec5e763de14cf3a8a316ab3ccb4124b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal\" (UID: \"7ec5e763de14cf3a8a316ab3ccb4124b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.340185 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.340179 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ec5e763de14cf3a8a316ab3ccb4124b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal\" (UID: \"7ec5e763de14cf3a8a316ab3ccb4124b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.340248 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.340211 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ec5e763de14cf3a8a316ab3ccb4124b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal\" (UID: \"7ec5e763de14cf3a8a316ab3ccb4124b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.387294 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.387254 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.391896 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.391870 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.424444 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.424411 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-242.ec2.internal\" not found" Apr 22 19:23:48.525024 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.524985 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-242.ec2.internal\" not found" Apr 22 19:23:48.625566 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.625490 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-242.ec2.internal\" not found" Apr 22 19:23:48.725833 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.725802 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-242.ec2.internal\" not found" Apr 22 19:23:48.730794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.730760 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:48.732747 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.732722 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:48.736042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.736023 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.741150 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.741132 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 19:23:48.741290 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.741273 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:23:48.741351 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.741324 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:23:48.741351 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.741325 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:23:48.741351 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.741326 2569 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://abc8c8fd2417744b69ee06f556a96d9a-11974c5adb8765c9.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/kube-system/pods\": read tcp 10.0.140.242:33214->54.163.97.203:6443: use of closed network connection" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.741454 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.741359 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal" Apr 22 19:23:48.759362 ip-10-0-140-242 kubenswrapper[2569]: I0422 
19:23:48.759340 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:23:48.814739 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.814712 2569 apiserver.go:52] "Watching apiserver" Apr 22 19:23:48.821416 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.821388 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 19:23:48.821763 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.821743 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-6b4cg","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal","openshift-multus/multus-additional-cni-plugins-x9sbh","openshift-multus/multus-h476q","openshift-multus/network-metrics-daemon-dx52z","openshift-network-diagnostics/network-check-target-mjd2c","kube-system/konnectivity-agent-pjlfc","kube-system/kube-apiserver-proxy-ip-10-0-140-242.ec2.internal","openshift-cluster-node-tuning-operator/tuned-j7nxx","openshift-network-operator/iptables-alerter-qnv98","openshift-ovn-kubernetes/ovnkube-node-99kk8","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv"] Apr 22 19:23:48.823337 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.823176 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6b4cg" Apr 22 19:23:48.825349 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.825322 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.825948 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.825927 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 19:23:48.826038 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.825931 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 19:23:48.826038 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.825959 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 19:23:48.826038 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.826034 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-v5nhj\"" Apr 22 19:23:48.826313 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.826290 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h476q" Apr 22 19:23:48.827238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.827219 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:48.827381 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.827270 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:23:48.827470 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.827456 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 19:23:48.828482 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.827847 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 19:23:48.828482 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.827878 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 19:23:48.828482 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.827887 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 19:23:48.828482 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.828085 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 19:23:48.828482 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.828161 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wbbcm\"" Apr 22 19:23:48.828482 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.828367 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:23:48.828482 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.828439 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:23:48.828482 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.828459 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pjlfc" Apr 22 19:23:48.828909 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.828568 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 19:23:48.828909 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.828717 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vqgbg\"" Apr 22 19:23:48.829516 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.829500 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.830651 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.830632 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-qnv98" Apr 22 19:23:48.830747 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.830707 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wm46v\"" Apr 22 19:23:48.831074 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.831056 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 19:23:48.831171 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.831139 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 19:23:48.831655 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.831636 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:48.831746 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.831668 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:48.831905 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.831892 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9b8wr\"" Apr 22 19:23:48.831964 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.831934 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.832931 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.832910 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:48.832931 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.832919 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:48.833075 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.832919 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xnlqg\"" Apr 22 19:23:48.833311 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.833080 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.833311 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.833231 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 19:23:48.835427 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.834778 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 19:23:48.835427 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.834995 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 19:23:48.835427 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.835013 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 19:23:48.835427 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.835090 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 19:23:48.835427 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.835289 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 19:23:48.835690 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.835626 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 19:23:48.836393 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.836323 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 19:23:48.836393 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.836357 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-97wkg\"" Apr 22 19:23:48.836531 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.836461 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 19:23:48.836531 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.836321 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l955b\"" Apr 22 19:23:48.836705 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.836690 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 19:23:48.836899 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.836887 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 19:23:48.837337 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.837319 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:23:48.842545 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842524 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-run-k8s-cni-cncf-io\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 
19:23:48.842622 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842553 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-multus-daemon-config\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.842622 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842571 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mx4j\" (UniqueName: \"kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j\") pod \"network-check-target-mjd2c\" (UID: \"6897e3de-61a5-4d68-9638-35ac613b4f31\") " pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:23:48.842622 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842588 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-sys\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.842713 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842638 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-host\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.842713 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842677 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/94c9353c-64db-4c45-9df3-30ea8b6efb63-ovnkube-config\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.842773 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842711 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.842773 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842738 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-var-lib-kubelet\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.842773 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842761 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-kubernetes\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.842874 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842784 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-systemd-units\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.842874 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842806 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-run-openvswitch\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.842874 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842829 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-device-dir\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.842874 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842854 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-run-netns\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.843054 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842874 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-run\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.843054 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842901 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.843054 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842927 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/94c9353c-64db-4c45-9df3-30ea8b6efb63-ovnkube-script-lib\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.843054 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842952 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2drq\" (UniqueName: \"kubernetes.io/projected/be6357c3-5a5a-42fe-871f-451ef4ce5f52-kube-api-access-z2drq\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.843054 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.842979 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.843054 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843004 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-var-lib-cni-multus\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.843054 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843028 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-run-netns\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.843054 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843052 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-run-ovn\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843075 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-cni-netd\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843118 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-var-lib-kubelet\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843142 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/319d7fc4-bd09-4f40-bc9c-908e50f344ed-host-slash\") pod \"iptables-alerter-qnv98\" (UID: \"319d7fc4-bd09-4f40-bc9c-908e50f344ed\") " pod="openshift-network-operator/iptables-alerter-qnv98" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843170 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-etc-selinux\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843193 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-os-release\") pod \"multus-h476q\" (UID: 
\"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843233 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-hostroot\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843268 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-slash\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843291 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/81956a4f-380b-43d9-919e-60fbb787f267-agent-certs\") pod \"konnectivity-agent-pjlfc\" (UID: \"81956a4f-380b-43d9-919e-60fbb787f267\") " pod="kube-system/konnectivity-agent-pjlfc" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843308 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/81956a4f-380b-43d9-919e-60fbb787f267-konnectivity-ca\") pod \"konnectivity-agent-pjlfc\" (UID: \"81956a4f-380b-43d9-919e-60fbb787f267\") " pod="kube-system/konnectivity-agent-pjlfc" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843341 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/319d7fc4-bd09-4f40-bc9c-908e50f344ed-iptables-alerter-script\") pod \"iptables-alerter-qnv98\" (UID: \"319d7fc4-bd09-4f40-bc9c-908e50f344ed\") " pod="openshift-network-operator/iptables-alerter-qnv98" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843361 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-sys-fs\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843377 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-tmp\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843391 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-log-socket\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843406 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-os-release\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843420 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-cni-binary-copy\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.843431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843435 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-multus-socket-dir-parent\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843451 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twcsr\" (UniqueName: \"kubernetes.io/projected/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-kube-api-access-twcsr\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843466 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-multus-cni-dir\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843480 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-cnibin\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843504 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-cni-binary-copy\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843517 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-lib-modules\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843539 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-run-systemd\") pod \"ovnkube-node-99kk8\" (UID: 
\"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843565 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppphw\" (UniqueName: \"kubernetes.io/projected/94c9353c-64db-4c45-9df3-30ea8b6efb63-kube-api-access-ppphw\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843587 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-multus-conf-dir\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843626 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-run-multus-certs\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843650 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843665 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-systemd\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843686 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-var-lib-openvswitch\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843712 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-registration-dir\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843729 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3dc70558-ecae-4e50-82a2-3b1c70e5cfb2-serviceca\") pod \"node-ca-6b4cg\" (UID: \"3dc70558-ecae-4e50-82a2-3b1c70e5cfb2\") " pod="openshift-image-registry/node-ca-6b4cg" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 
19:23:48.843745 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr96q\" (UniqueName: \"kubernetes.io/projected/3dc70558-ecae-4e50-82a2-3b1c70e5cfb2-kube-api-access-vr96q\") pod \"node-ca-6b4cg\" (UID: \"3dc70558-ecae-4e50-82a2-3b1c70e5cfb2\") " pod="openshift-image-registry/node-ca-6b4cg" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843766 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.844165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843783 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-etc-kubernetes\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843798 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmfwj\" (UniqueName: \"kubernetes.io/projected/f4583537-f5a4-4201-a5ba-5c41cf04b3da-kube-api-access-jmfwj\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843811 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-sysconfig\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843824 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-var-lib-cni-bin\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843847 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-etc-openvswitch\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-run-ovn-kubernetes\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843934 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-cni-bin\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843950 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/94c9353c-64db-4c45-9df3-30ea8b6efb63-env-overrides\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843967 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.843989 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-node-log\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.844012 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrzvd\" (UniqueName: \"kubernetes.io/projected/319d7fc4-bd09-4f40-bc9c-908e50f344ed-kube-api-access-zrzvd\") pod \"iptables-alerter-qnv98\" (UID: \"319d7fc4-bd09-4f40-bc9c-908e50f344ed\") " pod="openshift-network-operator/iptables-alerter-qnv98" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.844029 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-system-cni-dir\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.844056 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-system-cni-dir\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.844071 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-sysctl-d\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.844084 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/94c9353c-64db-4c45-9df3-30ea8b6efb63-ovn-node-metrics-cert\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.844116 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-socket-dir\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.845042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.844136 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62kkf\" (UniqueName: \"kubernetes.io/projected/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-kube-api-access-62kkf\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.845575 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.844153 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-sysctl-conf\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.845575 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.844166 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmvf4\" (UniqueName: \"kubernetes.io/projected/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-kube-api-access-lmvf4\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.845575 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.844180 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3dc70558-ecae-4e50-82a2-3b1c70e5cfb2-host\") pod \"node-ca-6b4cg\" (UID: \"3dc70558-ecae-4e50-82a2-3b1c70e5cfb2\") " pod="openshift-image-registry/node-ca-6b4cg" Apr 22 19:23:48.845575 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.844200 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-cnibin\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.845575 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.844218 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-modprobe-d\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.845575 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.844232 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-tuned\") pod \"tuned-j7nxx\" (UID: 
\"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.845575 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.844245 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-kubelet\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.845575 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.845306 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:18:47 +0000 UTC" deadline="2027-09-22 23:33:57.188370166 +0000 UTC" Apr 22 19:23:48.845575 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.845346 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12436h10m8.3430271s" Apr 22 19:23:48.851806 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.851783 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:23:48.874300 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.874264 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-q87mr" Apr 22 19:23:48.882605 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.882530 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-q87mr" Apr 22 19:23:48.920600 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:48.920557 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6923805a814b270020f7b819e6da6c2.slice/crio-8cc086f18e5b6531789a00a98c65cd035b34cead008ccab8a7906ab3323d53fe WatchSource:0}: Error finding container 8cc086f18e5b6531789a00a98c65cd035b34cead008ccab8a7906ab3323d53fe: Status 404 returned error can't find the container with id 8cc086f18e5b6531789a00a98c65cd035b34cead008ccab8a7906ab3323d53fe Apr 22 19:23:48.920813 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:48.920794 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ec5e763de14cf3a8a316ab3ccb4124b.slice/crio-658703e1d745ef5e18cbbf761d90fd9dbbd567b0f4e41b0680553fddfcb037fa WatchSource:0}: Error finding container 658703e1d745ef5e18cbbf761d90fd9dbbd567b0f4e41b0680553fddfcb037fa: Status 404 returned error can't find the container with id 658703e1d745ef5e18cbbf761d90fd9dbbd567b0f4e41b0680553fddfcb037fa Apr 22 19:23:48.926770 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.926752 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:23:48.944484 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944459 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-os-release\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.944594 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944490 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-cni-binary-copy\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.944594 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944510 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-multus-socket-dir-parent\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.944594 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944525 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twcsr\" (UniqueName: \"kubernetes.io/projected/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-kube-api-access-twcsr\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.944694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944602 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-multus-socket-dir-parent\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.944694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944602 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-os-release\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.944694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944637 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-multus-cni-dir\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.944694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944669 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-cnibin\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.944694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944685 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-cni-binary-copy\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.944897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944700 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-lib-modules\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.944897 ip-10-0-140-242 kubenswrapper[2569]: I0422 
19:23:48.944723 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-run-systemd\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.944897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944744 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-multus-cni-dir\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.944897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944753 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-cnibin\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.944897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppphw\" (UniqueName: \"kubernetes.io/projected/94c9353c-64db-4c45-9df3-30ea8b6efb63-kube-api-access-ppphw\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.944897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944785 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-run-systemd\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.944897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944789 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-multus-conf-dir\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.944897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944820 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-run-multus-certs\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.944897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944822 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-multus-conf-dir\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.944897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944847 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:48.944897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944852 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-run-multus-certs\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.944897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944866 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-lib-modules\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.944897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944878 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-systemd\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944916 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-systemd\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944918 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-var-lib-openvswitch\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.944941 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944964 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-registration-dir\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944969 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-var-lib-openvswitch\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.944991 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3dc70558-ecae-4e50-82a2-3b1c70e5cfb2-serviceca\") pod \"node-ca-6b4cg\" (UID: \"3dc70558-ecae-4e50-82a2-3b1c70e5cfb2\") " pod="openshift-image-registry/node-ca-6b4cg" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945019 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-registration-dir\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.945033 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs podName:f4583537-f5a4-4201-a5ba-5c41cf04b3da nodeName:}" failed. No retries permitted until 2026-04-22 19:23:49.444984827 +0000 UTC m=+2.092999626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs") pod "network-metrics-daemon-dx52z" (UID: "f4583537-f5a4-4201-a5ba-5c41cf04b3da") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945060 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vr96q\" (UniqueName: \"kubernetes.io/projected/3dc70558-ecae-4e50-82a2-3b1c70e5cfb2-kube-api-access-vr96q\") pod \"node-ca-6b4cg\" (UID: \"3dc70558-ecae-4e50-82a2-3b1c70e5cfb2\") " pod="openshift-image-registry/node-ca-6b4cg" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945077 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945079 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-cni-binary-copy\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945131 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-etc-kubernetes\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmfwj\" (UniqueName: \"kubernetes.io/projected/f4583537-f5a4-4201-a5ba-5c41cf04b3da-kube-api-access-jmfwj\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945190 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-sysconfig\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945233 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-etc-kubernetes\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945257 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-var-lib-cni-bin\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.945604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945257 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-cni-binary-copy\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945284 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-etc-openvswitch\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945293 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-sysconfig\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945295 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-var-lib-cni-bin\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945311 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-run-ovn-kubernetes\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945310 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945339 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-etc-openvswitch\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945372 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-cni-bin\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945386 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-run-ovn-kubernetes\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945422 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/94c9353c-64db-4c45-9df3-30ea8b6efb63-env-overrides\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945424 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3dc70558-ecae-4e50-82a2-3b1c70e5cfb2-serviceca\") pod \"node-ca-6b4cg\" (UID: \"3dc70558-ecae-4e50-82a2-3b1c70e5cfb2\") " pod="openshift-image-registry/node-ca-6b4cg" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945439 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945476 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-cni-bin\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945517 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-node-log\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945547 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrzvd\" (UniqueName: \"kubernetes.io/projected/319d7fc4-bd09-4f40-bc9c-908e50f344ed-kube-api-access-zrzvd\") pod \"iptables-alerter-qnv98\" (UID: \"319d7fc4-bd09-4f40-bc9c-908e50f344ed\") " pod="openshift-network-operator/iptables-alerter-qnv98" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945573 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-system-cni-dir\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " 
pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945613 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-node-log\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.946590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945638 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-system-cni-dir\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945665 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-sysctl-d\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945696 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-system-cni-dir\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945689 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/94c9353c-64db-4c45-9df3-30ea8b6efb63-ovn-node-metrics-cert\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945701 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-system-cni-dir\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945736 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-socket-dir\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945763 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62kkf\" (UniqueName: \"kubernetes.io/projected/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-kube-api-access-62kkf\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945789 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-sysctl-d\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945817 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-sysctl-conf\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945834 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmvf4\" (UniqueName: \"kubernetes.io/projected/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-kube-api-access-lmvf4\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945849 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3dc70558-ecae-4e50-82a2-3b1c70e5cfb2-host\") pod \"node-ca-6b4cg\" (UID: \"3dc70558-ecae-4e50-82a2-3b1c70e5cfb2\") " pod="openshift-image-registry/node-ca-6b4cg" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945873 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-socket-dir\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945872 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-cnibin\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945926 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945929 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-cnibin\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945952 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-modprobe-d\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 
19:23:48.945930 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/94c9353c-64db-4c45-9df3-30ea8b6efb63-env-overrides\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.947472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945972 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-tuned\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945974 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-sysctl-conf\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.945928 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3dc70558-ecae-4e50-82a2-3b1c70e5cfb2-host\") pod \"node-ca-6b4cg\" (UID: \"3dc70558-ecae-4e50-82a2-3b1c70e5cfb2\") " pod="openshift-image-registry/node-ca-6b4cg" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946010 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-kubelet\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946032 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-run-k8s-cni-cncf-io\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-multus-daemon-config\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946071 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mx4j\" (UniqueName: \"kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j\") pod \"network-check-target-mjd2c\" (UID: \"6897e3de-61a5-4d68-9638-35ac613b4f31\") " pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946085 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-modprobe-d\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.948081 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:23:48.946114 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-sys\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946138 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-host\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946150 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-run-k8s-cni-cncf-io\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946193 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-host\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946201 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-kubelet\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946232 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-sys\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946235 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/94c9353c-64db-4c45-9df3-30ea8b6efb63-ovnkube-config\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946251 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946262 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-var-lib-kubelet\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.948081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946321 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-kubernetes\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946342 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-systemd-units\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946356 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946365 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-run-openvswitch\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946401 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-device-dir\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946431 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-run-netns\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946447 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-systemd-units\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946456 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-run\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946470 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-run-openvswitch\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946483 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946501 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-kubernetes\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946509 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-run-netns\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946512 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/94c9353c-64db-4c45-9df3-30ea8b6efb63-ovnkube-script-lib\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946413 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-var-lib-kubelet\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946541 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2drq\" (UniqueName: \"kubernetes.io/projected/be6357c3-5a5a-42fe-871f-451ef4ce5f52-kube-api-access-z2drq\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946545 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-device-dir\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946581 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.948841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946605 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946614 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-var-lib-cni-multus\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946612 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-multus-daemon-config\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946654 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-run-netns\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946656 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-host-var-lib-cni-multus\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946675 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-run-ovn\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946699 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-cni-netd\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946707 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-run-netns\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946721 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-var-lib-kubelet\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946768 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/94c9353c-64db-4c45-9df3-30ea8b6efb63-ovnkube-config\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946774 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-run\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946813 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/319d7fc4-bd09-4f40-bc9c-908e50f344ed-host-slash\") pod \"iptables-alerter-qnv98\" (UID: \"319d7fc4-bd09-4f40-bc9c-908e50f344ed\") " pod="openshift-network-operator/iptables-alerter-qnv98" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946821 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-cni-netd\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946839 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-etc-selinux\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946867 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-var-lib-kubelet\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946887 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/319d7fc4-bd09-4f40-bc9c-908e50f344ed-host-slash\") pod \"iptables-alerter-qnv98\" (UID: 
\"319d7fc4-bd09-4f40-bc9c-908e50f344ed\") " pod="openshift-network-operator/iptables-alerter-qnv98" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946902 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-run-ovn\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946915 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-os-release\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.949548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946954 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-hostroot\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946958 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-etc-selinux\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.946985 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-slash\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947014 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-os-release\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947049 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/94c9353c-64db-4c45-9df3-30ea8b6efb63-ovnkube-script-lib\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947056 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-host-slash\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947059 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: 
\"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947066 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/81956a4f-380b-43d9-919e-60fbb787f267-agent-certs\") pod \"konnectivity-agent-pjlfc\" (UID: \"81956a4f-380b-43d9-919e-60fbb787f267\") " pod="kube-system/konnectivity-agent-pjlfc" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947055 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-hostroot\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947089 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/81956a4f-380b-43d9-919e-60fbb787f267-konnectivity-ca\") pod \"konnectivity-agent-pjlfc\" (UID: \"81956a4f-380b-43d9-919e-60fbb787f267\") " pod="kube-system/konnectivity-agent-pjlfc" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947199 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/319d7fc4-bd09-4f40-bc9c-908e50f344ed-iptables-alerter-script\") pod \"iptables-alerter-qnv98\" (UID: \"319d7fc4-bd09-4f40-bc9c-908e50f344ed\") " pod="openshift-network-operator/iptables-alerter-qnv98" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947234 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-sys-fs\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947258 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-tmp\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947302 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/be6357c3-5a5a-42fe-871f-451ef4ce5f52-sys-fs\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947336 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-log-socket\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947638 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/81956a4f-380b-43d9-919e-60fbb787f267-konnectivity-ca\") pod \"konnectivity-agent-pjlfc\" (UID: \"81956a4f-380b-43d9-919e-60fbb787f267\") " pod="kube-system/konnectivity-agent-pjlfc" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947720 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/94c9353c-64db-4c45-9df3-30ea8b6efb63-log-socket\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.947801 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/319d7fc4-bd09-4f40-bc9c-908e50f344ed-iptables-alerter-script\") pod \"iptables-alerter-qnv98\" (UID: \"319d7fc4-bd09-4f40-bc9c-908e50f344ed\") " pod="openshift-network-operator/iptables-alerter-qnv98" Apr 22 19:23:48.950401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.949822 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-242.ec2.internal" event={"ID":"a6923805a814b270020f7b819e6da6c2","Type":"ContainerStarted","Data":"8cc086f18e5b6531789a00a98c65cd035b34cead008ccab8a7906ab3323d53fe"} Apr 22 19:23:48.951045 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.950551 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-etc-tuned\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.951045 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.950577 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-tmp\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.951045 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.950664 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/94c9353c-64db-4c45-9df3-30ea8b6efb63-ovn-node-metrics-cert\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.951045 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.950766 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/81956a4f-380b-43d9-919e-60fbb787f267-agent-certs\") pod \"konnectivity-agent-pjlfc\" (UID: \"81956a4f-380b-43d9-919e-60fbb787f267\") " pod="kube-system/konnectivity-agent-pjlfc" Apr 22 19:23:48.951045 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.951001 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal" event={"ID":"7ec5e763de14cf3a8a316ab3ccb4124b","Type":"ContainerStarted","Data":"658703e1d745ef5e18cbbf761d90fd9dbbd567b0f4e41b0680553fddfcb037fa"} Apr 22 19:23:48.953658 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.953312 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 
19:23:48.953658 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.953334 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:48.953658 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.953348 2569 projected.go:194] Error preparing data for projected volume kube-api-access-9mx4j for pod openshift-network-diagnostics/network-check-target-mjd2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:48.953658 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:48.953469 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j podName:6897e3de-61a5-4d68-9638-35ac613b4f31 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:49.453448215 +0000 UTC m=+2.101463010 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9mx4j" (UniqueName: "kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j") pod "network-check-target-mjd2c" (UID: "6897e3de-61a5-4d68-9638-35ac613b4f31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:48.953954 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.953926 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twcsr\" (UniqueName: \"kubernetes.io/projected/a41b4bf8-7bc3-4be1-bb23-1c56997325bd-kube-api-access-twcsr\") pod \"multus-h476q\" (UID: \"a41b4bf8-7bc3-4be1-bb23-1c56997325bd\") " pod="openshift-multus/multus-h476q" Apr 22 19:23:48.955663 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.955611 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmfwj\" (UniqueName: \"kubernetes.io/projected/f4583537-f5a4-4201-a5ba-5c41cf04b3da-kube-api-access-jmfwj\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:48.955754 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.955686 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppphw\" (UniqueName: \"kubernetes.io/projected/94c9353c-64db-4c45-9df3-30ea8b6efb63-kube-api-access-ppphw\") pod \"ovnkube-node-99kk8\" (UID: \"94c9353c-64db-4c45-9df3-30ea8b6efb63\") " pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:48.955936 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.955910 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmvf4\" (UniqueName: \"kubernetes.io/projected/8a3f561c-50dc-4fa1-a16d-ff98f159b8e8-kube-api-access-lmvf4\") pod \"tuned-j7nxx\" (UID: \"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8\") " pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:48.956041 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.955963 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr96q\" (UniqueName: \"kubernetes.io/projected/3dc70558-ecae-4e50-82a2-3b1c70e5cfb2-kube-api-access-vr96q\") pod \"node-ca-6b4cg\" (UID: \"3dc70558-ecae-4e50-82a2-3b1c70e5cfb2\") " pod="openshift-image-registry/node-ca-6b4cg" Apr 22 19:23:48.956136 ip-10-0-140-242 kubenswrapper[2569]: I0422 
19:23:48.956069 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrzvd\" (UniqueName: \"kubernetes.io/projected/319d7fc4-bd09-4f40-bc9c-908e50f344ed-kube-api-access-zrzvd\") pod \"iptables-alerter-qnv98\" (UID: \"319d7fc4-bd09-4f40-bc9c-908e50f344ed\") " pod="openshift-network-operator/iptables-alerter-qnv98" Apr 22 19:23:48.956191 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.956162 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2drq\" (UniqueName: \"kubernetes.io/projected/be6357c3-5a5a-42fe-871f-451ef4ce5f52-kube-api-access-z2drq\") pod \"aws-ebs-csi-driver-node-nqzsv\" (UID: \"be6357c3-5a5a-42fe-871f-451ef4ce5f52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:48.956438 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:48.956420 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62kkf\" (UniqueName: \"kubernetes.io/projected/bf2d6d69-a4bd-4d9a-b48c-1f85a054c228-kube-api-access-62kkf\") pod \"multus-additional-cni-plugins-x9sbh\" (UID: \"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228\") " pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:49.153281 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.153172 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6b4cg" Apr 22 19:23:49.159951 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.159915 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x9sbh" Apr 22 19:23:49.160421 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:49.160396 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dc70558_ecae_4e50_82a2_3b1c70e5cfb2.slice/crio-510a1457057a2022fb950ad9b615f451613d408b71049cc76b36f0eb0f5e23c7 WatchSource:0}: Error finding container 510a1457057a2022fb950ad9b615f451613d408b71049cc76b36f0eb0f5e23c7: Status 404 returned error can't find the container with id 510a1457057a2022fb950ad9b615f451613d408b71049cc76b36f0eb0f5e23c7 Apr 22 19:23:49.166426 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:49.166404 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2d6d69_a4bd_4d9a_b48c_1f85a054c228.slice/crio-c2b3f7945f61314ad9ddaa428f9ea2fc3e396b900b5ca54cc54825f3a429605c WatchSource:0}: Error finding container c2b3f7945f61314ad9ddaa428f9ea2fc3e396b900b5ca54cc54825f3a429605c: Status 404 returned error can't find the container with id c2b3f7945f61314ad9ddaa428f9ea2fc3e396b900b5ca54cc54825f3a429605c Apr 22 19:23:49.181839 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.181809 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-h476q" Apr 22 19:23:49.187850 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:49.187824 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda41b4bf8_7bc3_4be1_bb23_1c56997325bd.slice/crio-cfaec9308250b76cb699c9c9368cbd0e7760448a4def730f26c7b31e21526a9f WatchSource:0}: Error finding container cfaec9308250b76cb699c9c9368cbd0e7760448a4def730f26c7b31e21526a9f: Status 404 returned error can't find the container with id cfaec9308250b76cb699c9c9368cbd0e7760448a4def730f26c7b31e21526a9f Apr 22 19:23:49.195969 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.195947 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pjlfc" Apr 22 19:23:49.202250 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.202228 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" Apr 22 19:23:49.203558 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:49.203540 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81956a4f_380b_43d9_919e_60fbb787f267.slice/crio-6c1f008b3d5f76428d0c556a4d8a8ce25318e002c08a7f034d19eac5474993be WatchSource:0}: Error finding container 6c1f008b3d5f76428d0c556a4d8a8ce25318e002c08a7f034d19eac5474993be: Status 404 returned error can't find the container with id 6c1f008b3d5f76428d0c556a4d8a8ce25318e002c08a7f034d19eac5474993be Apr 22 19:23:49.208760 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:49.208731 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a3f561c_50dc_4fa1_a16d_ff98f159b8e8.slice/crio-cc2938c1c726744a6157744065559751760d11586bfc097d00411ebef7040c37 WatchSource:0}: Error finding container cc2938c1c726744a6157744065559751760d11586bfc097d00411ebef7040c37: Status 404 returned error can't find the container with id cc2938c1c726744a6157744065559751760d11586bfc097d00411ebef7040c37 Apr 22 19:23:49.231873 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.231827 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qnv98" Apr 22 19:23:49.237912 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.237881 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:23:49.238074 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:49.238053 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod319d7fc4_bd09_4f40_bc9c_908e50f344ed.slice/crio-98bda01e0619c2f7f508d40dd91251466d6057008be3e539f57eff758d606278 WatchSource:0}: Error finding container 98bda01e0619c2f7f508d40dd91251466d6057008be3e539f57eff758d606278: Status 404 returned error can't find the container with id 98bda01e0619c2f7f508d40dd91251466d6057008be3e539f57eff758d606278 Apr 22 19:23:49.240532 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.240469 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" Apr 22 19:23:49.244547 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:49.244522 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94c9353c_64db_4c45_9df3_30ea8b6efb63.slice/crio-9ae687082c019c42bf2bb9584c0a3c6ad57765cf31196a2520c9e94ac899c304 WatchSource:0}: Error finding container 9ae687082c019c42bf2bb9584c0a3c6ad57765cf31196a2520c9e94ac899c304: Status 404 returned error can't find the container with id 9ae687082c019c42bf2bb9584c0a3c6ad57765cf31196a2520c9e94ac899c304 Apr 22 19:23:49.247636 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:23:49.247611 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe6357c3_5a5a_42fe_871f_451ef4ce5f52.slice/crio-2bd9748f474f105a6c66507a7e6231d26e298d177f1de59c486ceb5758084293 WatchSource:0}: Error finding container 2bd9748f474f105a6c66507a7e6231d26e298d177f1de59c486ceb5758084293: Status 404 returned error can't find the container with id 2bd9748f474f105a6c66507a7e6231d26e298d177f1de59c486ceb5758084293 Apr 22 19:23:49.328789 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.328750 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:49.450872 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.450765 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:49.451042 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:49.450915 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:49.451042 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:49.450981 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs podName:f4583537-f5a4-4201-a5ba-5c41cf04b3da nodeName:}" failed. No retries permitted until 2026-04-22 19:23:50.450962859 +0000 UTC m=+3.098977660 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs") pod "network-metrics-daemon-dx52z" (UID: "f4583537-f5a4-4201-a5ba-5c41cf04b3da") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:49.551627 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.551585 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mx4j\" (UniqueName: \"kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j\") pod \"network-check-target-mjd2c\" (UID: \"6897e3de-61a5-4d68-9638-35ac613b4f31\") " pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:23:49.551861 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:49.551765 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:49.551861 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:49.551784 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:49.551861 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:49.551795 2569 projected.go:194] Error preparing data for projected volume kube-api-access-9mx4j for pod openshift-network-diagnostics/network-check-target-mjd2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:49.551861 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:49.551860 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j podName:6897e3de-61a5-4d68-9638-35ac613b4f31 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:50.551846793 +0000 UTC m=+3.199861575 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9mx4j" (UniqueName: "kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j") pod "network-check-target-mjd2c" (UID: "6897e3de-61a5-4d68-9638-35ac613b4f31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:49.883911 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.883799 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:48 +0000 UTC" deadline="2027-12-22 23:03:18.460305633 +0000 UTC" Apr 22 19:23:49.883911 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.883854 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14619h39m28.576459384s" Apr 22 19:23:49.949821 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.949785 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:23:49.950019 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:49.949992 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:23:49.969979 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.969939 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pjlfc" event={"ID":"81956a4f-380b-43d9-919e-60fbb787f267","Type":"ContainerStarted","Data":"6c1f008b3d5f76428d0c556a4d8a8ce25318e002c08a7f034d19eac5474993be"} Apr 22 19:23:49.981722 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.981595 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h476q" event={"ID":"a41b4bf8-7bc3-4be1-bb23-1c56997325bd","Type":"ContainerStarted","Data":"cfaec9308250b76cb699c9c9368cbd0e7760448a4def730f26c7b31e21526a9f"} Apr 22 19:23:49.984049 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.984021 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:49.985904 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.985813 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9sbh" event={"ID":"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228","Type":"ContainerStarted","Data":"c2b3f7945f61314ad9ddaa428f9ea2fc3e396b900b5ca54cc54825f3a429605c"} Apr 22 19:23:49.995404 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:49.995341 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" event={"ID":"be6357c3-5a5a-42fe-871f-451ef4ce5f52","Type":"ContainerStarted","Data":"2bd9748f474f105a6c66507a7e6231d26e298d177f1de59c486ceb5758084293"} Apr 22 19:23:50.003964 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:50.003927 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" event={"ID":"94c9353c-64db-4c45-9df3-30ea8b6efb63","Type":"ContainerStarted","Data":"9ae687082c019c42bf2bb9584c0a3c6ad57765cf31196a2520c9e94ac899c304"} Apr 22 19:23:50.016628 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:50.016594 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qnv98" event={"ID":"319d7fc4-bd09-4f40-bc9c-908e50f344ed","Type":"ContainerStarted","Data":"98bda01e0619c2f7f508d40dd91251466d6057008be3e539f57eff758d606278"} Apr 22 19:23:50.025789 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:50.025749 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" event={"ID":"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8","Type":"ContainerStarted","Data":"cc2938c1c726744a6157744065559751760d11586bfc097d00411ebef7040c37"} Apr 22 19:23:50.035115 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:50.035032 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6b4cg" event={"ID":"3dc70558-ecae-4e50-82a2-3b1c70e5cfb2","Type":"ContainerStarted","Data":"510a1457057a2022fb950ad9b615f451613d408b71049cc76b36f0eb0f5e23c7"} Apr 22 19:23:50.460801 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:50.460755 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:50.460987 ip-10-0-140-242 kubenswrapper[2569]: E0422 
19:23:50.460942 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:50.461050 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:50.461011 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs podName:f4583537-f5a4-4201-a5ba-5c41cf04b3da nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.460991552 +0000 UTC m=+5.109006356 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs") pod "network-metrics-daemon-dx52z" (UID: "f4583537-f5a4-4201-a5ba-5c41cf04b3da") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:50.561767 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:50.561724 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mx4j\" (UniqueName: \"kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j\") pod \"network-check-target-mjd2c\" (UID: \"6897e3de-61a5-4d68-9638-35ac613b4f31\") " pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:23:50.561962 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:50.561910 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:50.561962 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:50.561939 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:50.561962 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:50.561953 2569 projected.go:194] Error preparing data for projected volume kube-api-access-9mx4j for pod openshift-network-diagnostics/network-check-target-mjd2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:50.562132 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:50.562026 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j podName:6897e3de-61a5-4d68-9638-35ac613b4f31 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.562003746 +0000 UTC m=+5.210018531 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9mx4j" (UniqueName: "kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j") pod "network-check-target-mjd2c" (UID: "6897e3de-61a5-4d68-9638-35ac613b4f31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:50.671116 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:50.671069 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:50.754522 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:50.754442 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:50.884978 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:50.884927 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:48 +0000 UTC" deadline="2027-12-23 08:25:26.941347575 +0000 UTC" Apr 22 19:23:50.884978 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:50.884973 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14629h1m36.056378432s" Apr 22 19:23:50.947594 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:50.947560 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:50.947758 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:50.947720 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:23:51.948763 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:51.948268 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:23:51.948763 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:51.948387 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:23:52.478576 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:52.477931 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:52.478576 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:52.478139 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:52.478576 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:52.478206 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs podName:f4583537-f5a4-4201-a5ba-5c41cf04b3da nodeName:}" failed. No retries permitted until 2026-04-22 19:23:56.478184791 +0000 UTC m=+9.126199579 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs") pod "network-metrics-daemon-dx52z" (UID: "f4583537-f5a4-4201-a5ba-5c41cf04b3da") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:52.579077 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:52.579040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mx4j\" (UniqueName: \"kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j\") pod \"network-check-target-mjd2c\" (UID: \"6897e3de-61a5-4d68-9638-35ac613b4f31\") " pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:23:52.579268 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:52.579233 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:52.579268 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:52.579255 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:52.579268 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:52.579268 2569 projected.go:194] Error preparing data for projected volume kube-api-access-9mx4j for pod openshift-network-diagnostics/network-check-target-mjd2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:52.579375 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:52.579329 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j podName:6897e3de-61a5-4d68-9638-35ac613b4f31 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:56.579309981 +0000 UTC m=+9.227324785 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9mx4j" (UniqueName: "kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j") pod "network-check-target-mjd2c" (UID: "6897e3de-61a5-4d68-9638-35ac613b4f31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:52.947726 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:52.947397 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:52.947726 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:52.947577 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:23:53.947231 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:53.947192 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:23:53.947692 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:53.947326 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:23:54.947490 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:54.947172 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:54.947490 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:54.947306 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:23:55.791468 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:55.790241 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9rtnc"] Apr 22 19:23:55.799555 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:55.798916 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9rtnc" Apr 22 19:23:55.801916 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:55.801741 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:23:55.801916 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:55.801911 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-g6d9q\"" Apr 22 19:23:55.803077 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:55.803054 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:23:55.907427 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:55.907385 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/51e23bda-7f24-43f3-9b0b-9e0f8a95c02f-hosts-file\") pod \"node-resolver-9rtnc\" (UID: \"51e23bda-7f24-43f3-9b0b-9e0f8a95c02f\") " pod="openshift-dns/node-resolver-9rtnc" Apr 22 19:23:55.907427 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:55.907426 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/51e23bda-7f24-43f3-9b0b-9e0f8a95c02f-tmp-dir\") pod \"node-resolver-9rtnc\" (UID: \"51e23bda-7f24-43f3-9b0b-9e0f8a95c02f\") " pod="openshift-dns/node-resolver-9rtnc" Apr 22 19:23:55.907668 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:55.907488 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vq56\" (UniqueName: \"kubernetes.io/projected/51e23bda-7f24-43f3-9b0b-9e0f8a95c02f-kube-api-access-5vq56\") pod \"node-resolver-9rtnc\" (UID: \"51e23bda-7f24-43f3-9b0b-9e0f8a95c02f\") " pod="openshift-dns/node-resolver-9rtnc" Apr 22 19:23:55.950135 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:55.949630 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:23:55.950135 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:55.949748 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:23:56.007995 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:56.007955 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vq56\" (UniqueName: \"kubernetes.io/projected/51e23bda-7f24-43f3-9b0b-9e0f8a95c02f-kube-api-access-5vq56\") pod \"node-resolver-9rtnc\" (UID: \"51e23bda-7f24-43f3-9b0b-9e0f8a95c02f\") " pod="openshift-dns/node-resolver-9rtnc" Apr 22 19:23:56.008166 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:56.008031 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/51e23bda-7f24-43f3-9b0b-9e0f8a95c02f-hosts-file\") pod \"node-resolver-9rtnc\" (UID: \"51e23bda-7f24-43f3-9b0b-9e0f8a95c02f\") " pod="openshift-dns/node-resolver-9rtnc" Apr 22 19:23:56.008166 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:56.008060 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/51e23bda-7f24-43f3-9b0b-9e0f8a95c02f-tmp-dir\") pod \"node-resolver-9rtnc\" (UID: \"51e23bda-7f24-43f3-9b0b-9e0f8a95c02f\") " pod="openshift-dns/node-resolver-9rtnc" Apr 22 19:23:56.008434 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:56.008380 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/51e23bda-7f24-43f3-9b0b-9e0f8a95c02f-hosts-file\") pod \"node-resolver-9rtnc\" (UID: \"51e23bda-7f24-43f3-9b0b-9e0f8a95c02f\") " pod="openshift-dns/node-resolver-9rtnc" Apr 22 19:23:56.009058 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:56.009012 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/51e23bda-7f24-43f3-9b0b-9e0f8a95c02f-tmp-dir\") pod \"node-resolver-9rtnc\" (UID: \"51e23bda-7f24-43f3-9b0b-9e0f8a95c02f\") " pod="openshift-dns/node-resolver-9rtnc" Apr 22 19:23:56.024784 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:56.024752 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vq56\" (UniqueName: \"kubernetes.io/projected/51e23bda-7f24-43f3-9b0b-9e0f8a95c02f-kube-api-access-5vq56\") pod \"node-resolver-9rtnc\" (UID: \"51e23bda-7f24-43f3-9b0b-9e0f8a95c02f\") " pod="openshift-dns/node-resolver-9rtnc" Apr 22 19:23:56.111237 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:56.111162 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9rtnc" Apr 22 19:23:56.512822 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:56.512735 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:56.512986 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:56.512892 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:56.512986 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:56.512960 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs podName:f4583537-f5a4-4201-a5ba-5c41cf04b3da nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:04.512943045 +0000 UTC m=+17.160957833 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs") pod "network-metrics-daemon-dx52z" (UID: "f4583537-f5a4-4201-a5ba-5c41cf04b3da") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:56.613686 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:56.613643 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mx4j\" (UniqueName: \"kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j\") pod \"network-check-target-mjd2c\" (UID: \"6897e3de-61a5-4d68-9638-35ac613b4f31\") " pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:23:56.613866 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:56.613799 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:56.613866 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:56.613817 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:56.613866 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:56.613829 2569 projected.go:194] Error preparing data for projected volume kube-api-access-9mx4j for pod openshift-network-diagnostics/network-check-target-mjd2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:56.613986 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:56.613893 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j podName:6897e3de-61a5-4d68-9638-35ac613b4f31 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:04.613866227 +0000 UTC m=+17.261881025 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9mx4j" (UniqueName: "kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j") pod "network-check-target-mjd2c" (UID: "6897e3de-61a5-4d68-9638-35ac613b4f31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:56.946804 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:56.946730 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:56.946953 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:56.946873 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:23:57.947797 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:57.947754 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:23:57.948228 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:57.947851 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:23:58.947558 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:58.947513 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:23:58.947731 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:58.947667 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:23:59.947073 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:23:59.947037 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:23:59.947519 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:23:59.947162 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:24:00.947435 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:00.947399 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:00.947862 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:00.947545 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:24:01.947315 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:01.947277 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:01.947507 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:01.947419 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:24:02.946998 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:02.946915 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:02.947162 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:02.947039 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:24:03.947115 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:03.947062 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:03.947514 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:03.947193 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:24:04.569389 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:04.569352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:04.569604 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:04.569507 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:04.569604 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:04.569578 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs podName:f4583537-f5a4-4201-a5ba-5c41cf04b3da nodeName:}" failed. No retries permitted until 2026-04-22 19:24:20.569559425 +0000 UTC m=+33.217574207 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs") pod "network-metrics-daemon-dx52z" (UID: "f4583537-f5a4-4201-a5ba-5c41cf04b3da") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:04.669881 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:04.669839 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mx4j\" (UniqueName: \"kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j\") pod \"network-check-target-mjd2c\" (UID: \"6897e3de-61a5-4d68-9638-35ac613b4f31\") " pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:04.670059 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:04.669984 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:24:04.670059 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:04.670001 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:24:04.670059 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:04.670010 2569 projected.go:194] Error preparing data for projected volume kube-api-access-9mx4j for pod openshift-network-diagnostics/network-check-target-mjd2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:04.670210 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:04.670066 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j podName:6897e3de-61a5-4d68-9638-35ac613b4f31 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:20.670047547 +0000 UTC m=+33.318062330 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9mx4j" (UniqueName: "kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j") pod "network-check-target-mjd2c" (UID: "6897e3de-61a5-4d68-9638-35ac613b4f31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:04.947306 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:04.947233 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:04.947687 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:04.947374 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:24:05.947308 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:05.947265 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:05.947790 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:05.947410 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:24:06.946978 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:06.946941 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:06.947191 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:06.947083 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:24:07.388588 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:24:07.388557 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51e23bda_7f24_43f3_9b0b_9e0f8a95c02f.slice/crio-fc9e13312f1eb399dc2aeb1a569fbced28270ed95ac4797d0aefd1b3541a2165 WatchSource:0}: Error finding container fc9e13312f1eb399dc2aeb1a569fbced28270ed95ac4797d0aefd1b3541a2165: Status 404 returned error can't find the container with id fc9e13312f1eb399dc2aeb1a569fbced28270ed95ac4797d0aefd1b3541a2165 Apr 22 19:24:07.947712 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:07.947466 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:07.947870 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:07.947795 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:24:08.072404 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:08.071856 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h476q" event={"ID":"a41b4bf8-7bc3-4be1-bb23-1c56997325bd","Type":"ContainerStarted","Data":"7eb7056b96c5aeb5227aeb5dd156490b410399b13341c71fe0717e3392f41110"} Apr 22 19:24:08.077847 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:08.077825 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:24:08.078743 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:08.078719 2569 generic.go:358] "Generic (PLEG): container finished" podID="94c9353c-64db-4c45-9df3-30ea8b6efb63" containerID="3f61b403ccecd428a6e234ce55de7c62af08870a32d7351455324f852922d856" exitCode=1 Apr 22 19:24:08.078881 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:08.078863 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" event={"ID":"94c9353c-64db-4c45-9df3-30ea8b6efb63","Type":"ContainerStarted","Data":"33e58ce4c636c9b2548343ab48dbd265a8f2ba7bfd8e2e30e5c01ac1a318e789"} Apr 22 19:24:08.078949 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:08.078892 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" event={"ID":"94c9353c-64db-4c45-9df3-30ea8b6efb63","Type":"ContainerStarted","Data":"181c2ad470373da695750c9473ffd65f43597a9acef03424e299897f2fc53ee1"} Apr 22 19:24:08.078949 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:08.078906 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" event={"ID":"94c9353c-64db-4c45-9df3-30ea8b6efb63","Type":"ContainerDied","Data":"3f61b403ccecd428a6e234ce55de7c62af08870a32d7351455324f852922d856"} Apr 22 19:24:08.078949 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:08.078920 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" event={"ID":"94c9353c-64db-4c45-9df3-30ea8b6efb63","Type":"ContainerStarted","Data":"9354aabc47ee95d1f737ee17a2bf6d505e8bc3f0871a4a2b7a91f5cd589dd3e9"} Apr 22 19:24:08.080537 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:08.080481 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" event={"ID":"8a3f561c-50dc-4fa1-a16d-ff98f159b8e8","Type":"ContainerStarted","Data":"917093697bd37da248c9c24f177ee1089e5bd75518d1d1138f390aa4dedb107f"} Apr 22 19:24:08.082045 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:08.081987 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9rtnc" event={"ID":"51e23bda-7f24-43f3-9b0b-9e0f8a95c02f","Type":"ContainerStarted","Data":"fc9e13312f1eb399dc2aeb1a569fbced28270ed95ac4797d0aefd1b3541a2165"} Apr 22 19:24:08.083204 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:08.083181 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-242.ec2.internal" event={"ID":"a6923805a814b270020f7b819e6da6c2","Type":"ContainerStarted","Data":"dd16dcfea693b320bd9ada33819d3308cd9c67381067a0a4c212edad0c6f19ea"} Apr 22 19:24:08.107335 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:08.107263 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-h476q" 
podStartSLOduration=1.5489774889999999 podStartE2EDuration="20.107243134s" podCreationTimestamp="2026-04-22 19:23:48 +0000 UTC" firstStartedPulling="2026-04-22 19:23:49.189544006 +0000 UTC m=+1.837558790" lastFinishedPulling="2026-04-22 19:24:07.74780964 +0000 UTC m=+20.395824435" observedRunningTime="2026-04-22 19:24:08.106270103 +0000 UTC m=+20.754284901" watchObservedRunningTime="2026-04-22 19:24:08.107243134 +0000 UTC m=+20.755257938" Apr 22 19:24:08.154813 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:08.154663 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-242.ec2.internal" podStartSLOduration=20.154647315 podStartE2EDuration="20.154647315s" podCreationTimestamp="2026-04-22 19:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:08.126879827 +0000 UTC m=+20.774894633" watchObservedRunningTime="2026-04-22 19:24:08.154647315 +0000 UTC m=+20.802662120" Apr 22 19:24:08.154813 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:08.154729 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-j7nxx" podStartSLOduration=1.9812511659999998 podStartE2EDuration="20.154725693s" podCreationTimestamp="2026-04-22 19:23:48 +0000 UTC" firstStartedPulling="2026-04-22 19:23:49.210272065 +0000 UTC m=+1.858286847" lastFinishedPulling="2026-04-22 19:24:07.383746578 +0000 UTC m=+20.031761374" observedRunningTime="2026-04-22 19:24:08.154028403 +0000 UTC m=+20.802043217" watchObservedRunningTime="2026-04-22 19:24:08.154725693 +0000 UTC m=+20.802740496" Apr 22 19:24:08.947238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:08.947201 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:08.947999 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:08.947344 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:24:09.086949 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.086911 2569 generic.go:358] "Generic (PLEG): container finished" podID="7ec5e763de14cf3a8a316ab3ccb4124b" containerID="2a027085575657bc898588b69a630410aca38755931ff14b115c76c2dc785caf" exitCode=0 Apr 22 19:24:09.087140 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.087007 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal" event={"ID":"7ec5e763de14cf3a8a316ab3ccb4124b","Type":"ContainerDied","Data":"2a027085575657bc898588b69a630410aca38755931ff14b115c76c2dc785caf"} Apr 22 19:24:09.088506 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.088478 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" event={"ID":"be6357c3-5a5a-42fe-871f-451ef4ce5f52","Type":"ContainerStarted","Data":"c551f4c477f7131db8f376df9b5eaadf3c4f4c53ac7c548dafb79e6829a49a4f"} Apr 22 19:24:09.091363 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.091334 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:24:09.091716 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.091676 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" event={"ID":"94c9353c-64db-4c45-9df3-30ea8b6efb63","Type":"ContainerStarted","Data":"623b97ef55f744b4d383de4d74e14afe118e6f2dbc78f4148dff47a6b9948093"} Apr 22 19:24:09.091716 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.091715 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" event={"ID":"94c9353c-64db-4c45-9df3-30ea8b6efb63","Type":"ContainerStarted","Data":"dce0941253c3a9266230c72bec8d8debfe7d7c7dfabf1bd74cf999fe65722fb4"} Apr 22 19:24:09.093023 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.092994 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qnv98" event={"ID":"319d7fc4-bd09-4f40-bc9c-908e50f344ed","Type":"ContainerStarted","Data":"cb603312e52a5201a228eae2d64554d0c83e6c947b5a4a7c90d97bc5f9bbc853"} Apr 22 19:24:09.094375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.094348 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6b4cg" event={"ID":"3dc70558-ecae-4e50-82a2-3b1c70e5cfb2","Type":"ContainerStarted","Data":"d13795d5ffb1d66974ea1b93df255fbfd3e642e9254f0081da883c7c00a6bc5b"} Apr 22 19:24:09.097649 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.097616 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9rtnc" event={"ID":"51e23bda-7f24-43f3-9b0b-9e0f8a95c02f","Type":"ContainerStarted","Data":"3b80370c53af7fd40d31aad905e4188d68219c09e06dae091faae7650ccc5ead"} Apr 22 19:24:09.099119 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.099078 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pjlfc" event={"ID":"81956a4f-380b-43d9-919e-60fbb787f267","Type":"ContainerStarted","Data":"3d8775853cd444a5037da8d2fc689083e7c2c75f5686a1a47c6a914dd18dfbb1"} Apr 22 19:24:09.100515 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.100492 2569 generic.go:358] "Generic (PLEG): container finished" podID="bf2d6d69-a4bd-4d9a-b48c-1f85a054c228" 
containerID="7038894bf0a46c6c2114dc6ade7e95d27cad6bf387b2920a6650bd2c611ae4f4" exitCode=0 Apr 22 19:24:09.100606 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.100593 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9sbh" event={"ID":"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228","Type":"ContainerDied","Data":"7038894bf0a46c6c2114dc6ade7e95d27cad6bf387b2920a6650bd2c611ae4f4"} Apr 22 19:24:09.119729 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.119628 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9rtnc" podStartSLOduration=14.119616005 podStartE2EDuration="14.119616005s" podCreationTimestamp="2026-04-22 19:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:09.119079571 +0000 UTC m=+21.767094375" watchObservedRunningTime="2026-04-22 19:24:09.119616005 +0000 UTC m=+21.767630808" Apr 22 19:24:09.166209 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.166160 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pjlfc" podStartSLOduration=2.98796092 podStartE2EDuration="21.166144317s" podCreationTimestamp="2026-04-22 19:23:48 +0000 UTC" firstStartedPulling="2026-04-22 19:23:49.205526649 +0000 UTC m=+1.853541430" lastFinishedPulling="2026-04-22 19:24:07.383710027 +0000 UTC m=+20.031724827" observedRunningTime="2026-04-22 19:24:09.166066845 +0000 UTC m=+21.814081650" watchObservedRunningTime="2026-04-22 19:24:09.166144317 +0000 UTC m=+21.814159121" Apr 22 19:24:09.182959 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.182880 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qnv98" podStartSLOduration=3.037405995 podStartE2EDuration="21.18286617s" podCreationTimestamp="2026-04-22 19:23:48 +0000 UTC" firstStartedPulling="2026-04-22 19:23:49.240850946 +0000 UTC m=+1.888865729" lastFinishedPulling="2026-04-22 19:24:07.386311104 +0000 UTC m=+20.034325904" observedRunningTime="2026-04-22 19:24:09.182337611 +0000 UTC m=+21.830352417" watchObservedRunningTime="2026-04-22 19:24:09.18286617 +0000 UTC m=+21.830880974" Apr 22 19:24:09.198020 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.197902 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6b4cg" podStartSLOduration=3.0072076 podStartE2EDuration="21.197882077s" podCreationTimestamp="2026-04-22 19:23:48 +0000 UTC" firstStartedPulling="2026-04-22 19:23:49.162749329 +0000 UTC m=+1.810764111" lastFinishedPulling="2026-04-22 19:24:07.3534238 +0000 UTC m=+20.001438588" observedRunningTime="2026-04-22 19:24:09.19721192 +0000 UTC m=+21.845226716" watchObservedRunningTime="2026-04-22 19:24:09.197882077 +0000 UTC m=+21.845896882" Apr 22 19:24:09.367810 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.367652 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:24:09.660369 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.660337 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pjlfc" Apr 22 19:24:09.905472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.905341 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:24:09.367806868Z","UUID":"1cf529f1-1a8d-4496-a382-595ec999421d","Handler":null,"Name":"","Endpoint":""} Apr 22 19:24:09.907877 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.907839 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 19:24:09.907877 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.907877 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:24:09.947523 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:09.947492 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:09.948194 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:09.947608 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:24:10.105197 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:10.105159 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal" event={"ID":"7ec5e763de14cf3a8a316ab3ccb4124b","Type":"ContainerStarted","Data":"0e66265fc1f63cd2329ff5c37cc476dfdfb7f2ef13363c3cd50ae61ead933d64"} Apr 22 19:24:10.107557 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:10.107520 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" event={"ID":"be6357c3-5a5a-42fe-871f-451ef4ce5f52","Type":"ContainerStarted","Data":"cc71a2e51ede6e7153132ee30c1f2e09757dafc7f789d9df82b500c2a787f596"} Apr 22 19:24:10.122962 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:10.122904 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-242.ec2.internal" podStartSLOduration=22.122886329 podStartE2EDuration="22.122886329s" podCreationTimestamp="2026-04-22 19:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:10.122000409 +0000 UTC m=+22.770015214" watchObservedRunningTime="2026-04-22 19:24:10.122886329 +0000 UTC m=+22.770901134" Apr 22 19:24:10.946832 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:10.946788 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:10.947011 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:10.946917 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:24:10.986356 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:10.986320 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-jwtj4"] Apr 22 19:24:10.988514 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:10.988488 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:10.988666 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:10.988576 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jwtj4" podUID="a3e9c485-cff6-44ce-b842-b27605d809bb" Apr 22 19:24:11.111910 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:11.111873 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" event={"ID":"be6357c3-5a5a-42fe-871f-451ef4ce5f52","Type":"ContainerStarted","Data":"f06ba2d462c99959805e906e18255e1d5494baafee232c2949cadde255b5c389"} Apr 22 19:24:11.115247 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:11.115222 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:24:11.115530 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:11.115501 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a3e9c485-cff6-44ce-b842-b27605d809bb-dbus\") pod \"global-pull-secret-syncer-jwtj4\" (UID: \"a3e9c485-cff6-44ce-b842-b27605d809bb\") " pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:11.115699 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:11.115590 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a3e9c485-cff6-44ce-b842-b27605d809bb-kubelet-config\") pod \"global-pull-secret-syncer-jwtj4\" (UID: \"a3e9c485-cff6-44ce-b842-b27605d809bb\") " pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:11.115699 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:11.115626 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret\") pod \"global-pull-secret-syncer-jwtj4\" (UID: \"a3e9c485-cff6-44ce-b842-b27605d809bb\") " pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:11.115803 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:11.115734 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" event={"ID":"94c9353c-64db-4c45-9df3-30ea8b6efb63","Type":"ContainerStarted","Data":"705f03c7dba3b3726ae8c6c2e027ebf498d1591086bf3f3fc0317e2ca486a2b7"} Apr 22 19:24:11.130592 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:11.130540 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nqzsv" podStartSLOduration=2.177916359 podStartE2EDuration="23.130525958s" podCreationTimestamp="2026-04-22 19:23:48 +0000 UTC" 
firstStartedPulling="2026-04-22 19:23:49.249468917 +0000 UTC m=+1.897483699" lastFinishedPulling="2026-04-22 19:24:10.202078501 +0000 UTC m=+22.850093298" observedRunningTime="2026-04-22 19:24:11.130420362 +0000 UTC m=+23.778435167" watchObservedRunningTime="2026-04-22 19:24:11.130525958 +0000 UTC m=+23.778540786" Apr 22 19:24:11.216971 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:11.216884 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a3e9c485-cff6-44ce-b842-b27605d809bb-kubelet-config\") pod \"global-pull-secret-syncer-jwtj4\" (UID: \"a3e9c485-cff6-44ce-b842-b27605d809bb\") " pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:11.216971 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:11.216931 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret\") pod \"global-pull-secret-syncer-jwtj4\" (UID: \"a3e9c485-cff6-44ce-b842-b27605d809bb\") " pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:11.217218 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:11.217033 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a3e9c485-cff6-44ce-b842-b27605d809bb-kubelet-config\") pod \"global-pull-secret-syncer-jwtj4\" (UID: \"a3e9c485-cff6-44ce-b842-b27605d809bb\") " pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:11.217218 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:11.217050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a3e9c485-cff6-44ce-b842-b27605d809bb-dbus\") pod \"global-pull-secret-syncer-jwtj4\" (UID: \"a3e9c485-cff6-44ce-b842-b27605d809bb\") " pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:11.217218 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:11.217148 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:24:11.217218 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:11.217212 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret podName:a3e9c485-cff6-44ce-b842-b27605d809bb nodeName:}" failed. No retries permitted until 2026-04-22 19:24:11.717191348 +0000 UTC m=+24.365206142 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret") pod "global-pull-secret-syncer-jwtj4" (UID: "a3e9c485-cff6-44ce-b842-b27605d809bb") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:24:11.217411 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:11.217354 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a3e9c485-cff6-44ce-b842-b27605d809bb-dbus\") pod \"global-pull-secret-syncer-jwtj4\" (UID: \"a3e9c485-cff6-44ce-b842-b27605d809bb\") " pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:11.721246 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:11.721202 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret\") pod \"global-pull-secret-syncer-jwtj4\" (UID: \"a3e9c485-cff6-44ce-b842-b27605d809bb\") " pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:11.721411 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:11.721391 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:24:11.721484 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:11.721472 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret podName:a3e9c485-cff6-44ce-b842-b27605d809bb nodeName:}" failed. No retries permitted until 2026-04-22 19:24:12.721455389 +0000 UTC m=+25.369470186 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret") pod "global-pull-secret-syncer-jwtj4" (UID: "a3e9c485-cff6-44ce-b842-b27605d809bb") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:24:11.947388 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:11.947352 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:11.947579 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:11.947477 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:24:12.728530 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:12.728483 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret\") pod \"global-pull-secret-syncer-jwtj4\" (UID: \"a3e9c485-cff6-44ce-b842-b27605d809bb\") " pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:12.729047 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:12.728621 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:24:12.729047 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:12.728708 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret podName:a3e9c485-cff6-44ce-b842-b27605d809bb nodeName:}" failed. No retries permitted until 2026-04-22 19:24:14.728686228 +0000 UTC m=+27.376701029 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret") pod "global-pull-secret-syncer-jwtj4" (UID: "a3e9c485-cff6-44ce-b842-b27605d809bb") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:24:12.946911 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:12.946878 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:12.947081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:12.946878 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:12.947081 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:12.946983 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jwtj4" podUID="a3e9c485-cff6-44ce-b842-b27605d809bb" Apr 22 19:24:12.947081 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:12.947062 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:24:13.087031 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:13.086849 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-pjlfc" Apr 22 19:24:13.087589 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:13.087567 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pjlfc" Apr 22 19:24:13.120084 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:13.120064 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pjlfc" Apr 22 19:24:13.947529 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:13.947490 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:13.947958 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:13.947605 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:24:14.124077 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:14.124052 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:24:14.124455 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:14.124428 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" event={"ID":"94c9353c-64db-4c45-9df3-30ea8b6efb63","Type":"ContainerStarted","Data":"432b5660debecdf04eb5dba2350247d87457e30b8c231ca3b95a4409f9acdf32"} Apr 22 19:24:14.124722 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:14.124697 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:24:14.124820 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:14.124727 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:24:14.124926 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:14.124913 2569 scope.go:117] "RemoveContainer" containerID="3f61b403ccecd428a6e234ce55de7c62af08870a32d7351455324f852922d856" Apr 22 19:24:14.126650 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:14.126623 2569 generic.go:358] "Generic (PLEG): container finished" podID="bf2d6d69-a4bd-4d9a-b48c-1f85a054c228" containerID="5dc6a8930194e754e6fbc7ff62c311486c7c6218d090ed371e2a7734f9760059" exitCode=0 Apr 22 19:24:14.126727 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:14.126697 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9sbh" event={"ID":"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228","Type":"ContainerDied","Data":"5dc6a8930194e754e6fbc7ff62c311486c7c6218d090ed371e2a7734f9760059"} Apr 22 19:24:14.141385 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:14.141359 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:24:14.744730 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:14.744466 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret\") pod \"global-pull-secret-syncer-jwtj4\" (UID: \"a3e9c485-cff6-44ce-b842-b27605d809bb\") " pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:14.744882 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:14.744699 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:24:14.744882 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:14.744810 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret podName:a3e9c485-cff6-44ce-b842-b27605d809bb nodeName:}" failed. No retries permitted until 2026-04-22 19:24:18.744789051 +0000 UTC m=+31.392803837 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret") pod "global-pull-secret-syncer-jwtj4" (UID: "a3e9c485-cff6-44ce-b842-b27605d809bb") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:24:14.946951 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:14.946868 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:14.946951 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:14.946919 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:14.947176 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:14.946996 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jwtj4" podUID="a3e9c485-cff6-44ce-b842-b27605d809bb" Apr 22 19:24:14.947249 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:14.947173 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:24:15.051171 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:15.050992 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jwtj4"] Apr 22 19:24:15.052513 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:15.051886 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dx52z"] Apr 22 19:24:15.052560 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:15.052543 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mjd2c"] Apr 22 19:24:15.052670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:15.052655 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:15.052795 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:15.052772 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:24:15.130460 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:15.130421 2569 generic.go:358] "Generic (PLEG): container finished" podID="bf2d6d69-a4bd-4d9a-b48c-1f85a054c228" containerID="352250c3183a309e3a61881c7410228786e65ed1b1b7519a77778beaf31efc1b" exitCode=0 Apr 22 19:24:15.130630 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:15.130512 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9sbh" event={"ID":"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228","Type":"ContainerDied","Data":"352250c3183a309e3a61881c7410228786e65ed1b1b7519a77778beaf31efc1b"} Apr 22 19:24:15.133985 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:15.133968 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:24:15.134361 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:15.134334 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" event={"ID":"94c9353c-64db-4c45-9df3-30ea8b6efb63","Type":"ContainerStarted","Data":"033dd2690effe8d844f7844765ca608b5b1816d8aa56751eaaaa49ea78e7403e"} Apr 22 19:24:15.134458 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:15.134440 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:15.134536 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:15.134524 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:15.134640 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:15.134619 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jwtj4" podUID="a3e9c485-cff6-44ce-b842-b27605d809bb" Apr 22 19:24:15.134734 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:15.134700 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:24:15.134907 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:15.134893 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:24:15.149669 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:15.149644 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:24:16.138404 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:16.138365 2569 generic.go:358] "Generic (PLEG): container finished" podID="bf2d6d69-a4bd-4d9a-b48c-1f85a054c228" containerID="e2af3491d3fc774fc19ba119c41447a870954e85752ae1df63c72fbf589a6167" exitCode=0 Apr 22 19:24:16.138758 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:16.138443 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9sbh" event={"ID":"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228","Type":"ContainerDied","Data":"e2af3491d3fc774fc19ba119c41447a870954e85752ae1df63c72fbf589a6167"} Apr 22 19:24:16.166943 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:16.166896 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" podStartSLOduration=9.987133337 podStartE2EDuration="28.166881208s" podCreationTimestamp="2026-04-22 19:23:48 +0000 UTC" firstStartedPulling="2026-04-22 19:23:49.246262054 +0000 UTC m=+1.894276836" lastFinishedPulling="2026-04-22 19:24:07.426009925 +0000 UTC m=+20.074024707" observedRunningTime="2026-04-22 19:24:15.187833365 +0000 UTC m=+27.835848163" watchObservedRunningTime="2026-04-22 19:24:16.166881208 +0000 UTC m=+28.814896012" Apr 22 19:24:16.947427 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:16.947400 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:16.947559 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:16.947396 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:16.947559 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:16.947535 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:16.947559 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:16.947538 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:24:16.947679 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:16.947603 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jwtj4" podUID="a3e9c485-cff6-44ce-b842-b27605d809bb" Apr 22 19:24:16.947679 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:16.947668 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:24:18.778167 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:18.777938 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret\") pod \"global-pull-secret-syncer-jwtj4\" (UID: \"a3e9c485-cff6-44ce-b842-b27605d809bb\") " pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:18.778749 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:18.778134 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:24:18.778749 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:18.778247 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret podName:a3e9c485-cff6-44ce-b842-b27605d809bb nodeName:}" failed. No retries permitted until 2026-04-22 19:24:26.778228157 +0000 UTC m=+39.426242944 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret") pod "global-pull-secret-syncer-jwtj4" (UID: "a3e9c485-cff6-44ce-b842-b27605d809bb") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:24:18.947426 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:18.947391 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:18.947426 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:18.947420 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:18.947676 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:18.947442 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:18.947676 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:18.947533 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jwtj4" podUID="a3e9c485-cff6-44ce-b842-b27605d809bb" Apr 22 19:24:18.947779 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:18.947657 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mjd2c" podUID="6897e3de-61a5-4d68-9638-35ac613b4f31" Apr 22 19:24:18.947779 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:18.947752 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:24:19.693455 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.693422 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-242.ec2.internal" event="NodeReady" Apr 22 19:24:19.693643 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.693589 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:24:19.731424 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.731377 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-74968f5d75-g9w5s"] Apr 22 19:24:19.774156 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.774122 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mmxtx"] Apr 22 19:24:19.774451 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.774434 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.777082 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.777056 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 19:24:19.777499 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.777477 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 19:24:19.777722 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.777705 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 19:24:19.777959 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.777933 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lccrt\"" Apr 22 19:24:19.782698 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.782679 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 19:24:19.793324 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.793296 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zgpcw"] Apr 22 19:24:19.793472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.793454 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:19.796434 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.796409 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jd9zz\"" Apr 22 19:24:19.796562 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.796409 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:24:19.796634 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.796618 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:24:19.821037 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.821004 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-74968f5d75-g9w5s"] Apr 22 19:24:19.821037 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.821036 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mmxtx"] Apr 22 19:24:19.821261 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.821050 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zgpcw"] Apr 22 19:24:19.821261 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.821183 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:24:19.824378 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.824351 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:24:19.824694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.824670 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:24:19.825262 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.824896 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:24:19.825262 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.825154 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-76gtl\"" Apr 22 19:24:19.887056 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.887016 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert\") pod \"ingress-canary-zgpcw\" (UID: \"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:24:19.887227 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.887066 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptksp\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-kube-api-access-ptksp\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.887227 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.887118 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19316983-25d3-46c3-a82f-273e8d4421aa-ca-trust-extracted\") pod \"image-registry-74968f5d75-g9w5s\" (UID: 
\"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.887227 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.887149 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-bound-sa-token\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.887227 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.887203 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-tmp-dir\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:19.887444 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.887258 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/19316983-25d3-46c3-a82f-273e8d4421aa-image-registry-private-configuration\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.887444 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.887342 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.887444 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.887372 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19316983-25d3-46c3-a82f-273e8d4421aa-trusted-ca\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.887444 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.887416 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:19.887634 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.887463 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-config-volume\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:19.887634 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.887494 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19316983-25d3-46c3-a82f-273e8d4421aa-registry-certificates\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " 
pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.887634 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.887522 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfm8j\" (UniqueName: \"kubernetes.io/projected/99054ff8-b2bf-49da-9d88-9f03b317fea0-kube-api-access-lfm8j\") pod \"ingress-canary-zgpcw\" (UID: \"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:24:19.887634 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.887546 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19316983-25d3-46c3-a82f-273e8d4421aa-installation-pull-secrets\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.887634 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.887578 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5f6c\" (UniqueName: \"kubernetes.io/projected/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-kube-api-access-r5f6c\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:19.988262 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988175 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-tmp-dir\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:19.988262 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988232 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/19316983-25d3-46c3-a82f-273e8d4421aa-image-registry-private-configuration\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.988484 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988437 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.988484 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988477 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19316983-25d3-46c3-a82f-273e8d4421aa-trusted-ca\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.988570 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988521 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:19.988570 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988554 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-config-volume\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:19.988570 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:19.988558 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:19.988671 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:19.988576 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74968f5d75-g9w5s: secret "image-registry-tls" not found Apr 22 19:24:19.988671 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988582 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19316983-25d3-46c3-a82f-273e8d4421aa-registry-certificates\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.988671 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988593 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-tmp-dir\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:19.988671 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988611 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfm8j\" (UniqueName: \"kubernetes.io/projected/99054ff8-b2bf-49da-9d88-9f03b317fea0-kube-api-access-lfm8j\") pod \"ingress-canary-zgpcw\" (UID: \"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:24:19.988671 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:19.988642 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls podName:19316983-25d3-46c3-a82f-273e8d4421aa nodeName:}" failed. No retries permitted until 2026-04-22 19:24:20.488623803 +0000 UTC m=+33.136638598 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls") pod "image-registry-74968f5d75-g9w5s" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa") : secret "image-registry-tls" not found Apr 22 19:24:19.988671 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988661 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19316983-25d3-46c3-a82f-273e8d4421aa-installation-pull-secrets\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.988844 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988700 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5f6c\" (UniqueName: \"kubernetes.io/projected/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-kube-api-access-r5f6c\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:19.988844 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988750 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert\") pod \"ingress-canary-zgpcw\" (UID: \"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:24:19.988844 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988774 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptksp\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-kube-api-access-ptksp\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.988844 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988797 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19316983-25d3-46c3-a82f-273e8d4421aa-ca-trust-extracted\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.988844 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.988816 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-bound-sa-token\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.989047 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:19.988866 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:19.989047 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:19.988946 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls podName:7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:20.488930076 +0000 UTC m=+33.136944861 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls") pod "dns-default-mmxtx" (UID: "7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78") : secret "dns-default-metrics-tls" not found Apr 22 19:24:19.989167 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:19.989149 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:19.989214 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:19.989191 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert podName:99054ff8-b2bf-49da-9d88-9f03b317fea0 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:20.489177747 +0000 UTC m=+33.137192545 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert") pod "ingress-canary-zgpcw" (UID: "99054ff8-b2bf-49da-9d88-9f03b317fea0") : secret "canary-serving-cert" not found Apr 22 19:24:19.989471 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.989446 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-config-volume\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:19.989584 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.989566 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19316983-25d3-46c3-a82f-273e8d4421aa-registry-certificates\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.989635 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.989588 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19316983-25d3-46c3-a82f-273e8d4421aa-trusted-ca\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.989635 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.989619 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19316983-25d3-46c3-a82f-273e8d4421aa-ca-trust-extracted\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.993256 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.993235 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/19316983-25d3-46c3-a82f-273e8d4421aa-image-registry-private-configuration\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.993366 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.993242 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19316983-25d3-46c3-a82f-273e8d4421aa-installation-pull-secrets\") pod \"image-registry-74968f5d75-g9w5s\" (UID: 
\"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.998570 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.998521 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5f6c\" (UniqueName: \"kubernetes.io/projected/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-kube-api-access-r5f6c\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:19.998867 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.998846 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptksp\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-kube-api-access-ptksp\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.999452 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.999432 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-bound-sa-token\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:19.999532 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:19.999513 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfm8j\" (UniqueName: \"kubernetes.io/projected/99054ff8-b2bf-49da-9d88-9f03b317fea0-kube-api-access-lfm8j\") pod \"ingress-canary-zgpcw\" (UID: \"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:24:20.493433 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.493397 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert\") pod \"ingress-canary-zgpcw\" (UID: \"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:24:20.493624 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.493466 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:20.493624 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.493498 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:20.493624 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:20.493568 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:20.493624 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:20.493614 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:20.493624 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:20.493617 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 22 19:24:20.493858 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:20.493635 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74968f5d75-g9w5s: secret "image-registry-tls" not found Apr 22 19:24:20.493858 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:20.493662 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert podName:99054ff8-b2bf-49da-9d88-9f03b317fea0 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:21.493641598 +0000 UTC m=+34.141656389 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert") pod "ingress-canary-zgpcw" (UID: "99054ff8-b2bf-49da-9d88-9f03b317fea0") : secret "canary-serving-cert" not found Apr 22 19:24:20.493858 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:20.493686 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls podName:7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:21.493675838 +0000 UTC m=+34.141690627 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls") pod "dns-default-mmxtx" (UID: "7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78") : secret "dns-default-metrics-tls" not found Apr 22 19:24:20.493858 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:20.493702 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls podName:19316983-25d3-46c3-a82f-273e8d4421aa nodeName:}" failed. No retries permitted until 2026-04-22 19:24:21.493693975 +0000 UTC m=+34.141708758 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls") pod "image-registry-74968f5d75-g9w5s" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa") : secret "image-registry-tls" not found Apr 22 19:24:20.545542 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.545506 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7"] Apr 22 19:24:20.568874 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.568844 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7"] Apr 22 19:24:20.568874 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.568873 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr"] Apr 22 19:24:20.579477 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.579442 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr"] Apr 22 19:24:20.579617 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.579553 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7" Apr 22 19:24:20.579617 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.579567 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.582838 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.582813 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 19:24:20.582971 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.582875 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 19:24:20.582971 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.582884 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 19:24:20.584010 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.583987 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-lfdqn\"" Apr 22 19:24:20.584148 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.584134 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 19:24:20.584840 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.584226 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 19:24:20.584840 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.584257 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 19:24:20.584840 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.584286 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 19:24:20.584840 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.584346 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 19:24:20.594745 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.594724 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:20.594891 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:20.594872 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:20.594955 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:20.594945 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs podName:f4583537-f5a4-4201-a5ba-5c41cf04b3da nodeName:}" failed. No retries permitted until 2026-04-22 19:24:52.594925205 +0000 UTC m=+65.242939992 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs") pod "network-metrics-daemon-dx52z" (UID: "f4583537-f5a4-4201-a5ba-5c41cf04b3da") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:20.695675 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.695632 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/988f94f3-b4ce-498d-9c0c-422f36f04ed5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.695675 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.695682 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/988f94f3-b4ce-498d-9c0c-422f36f04ed5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.695901 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.695817 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4l2n\" (UniqueName: \"kubernetes.io/projected/988f94f3-b4ce-498d-9c0c-422f36f04ed5-kube-api-access-k4l2n\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.695953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.695901 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/988f94f3-b4ce-498d-9c0c-422f36f04ed5-ca\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.695953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.695943 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mx4j\" (UniqueName: \"kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j\") pod \"network-check-target-mjd2c\" (UID: \"6897e3de-61a5-4d68-9638-35ac613b4f31\") " pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:20.696045 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.695983 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/988f94f3-b4ce-498d-9c0c-422f36f04ed5-hub\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.696045 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.696015 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dklr9\" (UniqueName: \"kubernetes.io/projected/69750687-e7af-4d6a-8178-058215b4f2e5-kube-api-access-dklr9\") pod \"managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7\" (UID: \"69750687-e7af-4d6a-8178-058215b4f2e5\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7" Apr 22 19:24:20.696181 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.696046 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69750687-e7af-4d6a-8178-058215b4f2e5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7\" (UID: \"69750687-e7af-4d6a-8178-058215b4f2e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7" Apr 22 19:24:20.696181 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.696076 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/988f94f3-b4ce-498d-9c0c-422f36f04ed5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.696181 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:20.696117 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:24:20.696181 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:20.696144 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:24:20.696181 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:20.696158 2569 projected.go:194] Error preparing data for projected volume kube-api-access-9mx4j for pod openshift-network-diagnostics/network-check-target-mjd2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:20.696400 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:20.696216 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j podName:6897e3de-61a5-4d68-9638-35ac613b4f31 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:52.696196213 +0000 UTC m=+65.344211008 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9mx4j" (UniqueName: "kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j") pod "network-check-target-mjd2c" (UID: "6897e3de-61a5-4d68-9638-35ac613b4f31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:20.797290 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.797206 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/988f94f3-b4ce-498d-9c0c-422f36f04ed5-ca\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.797290 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.797264 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/988f94f3-b4ce-498d-9c0c-422f36f04ed5-hub\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.797290 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.797294 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dklr9\" (UniqueName: \"kubernetes.io/projected/69750687-e7af-4d6a-8178-058215b4f2e5-kube-api-access-dklr9\") pod \"managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7\" (UID: \"69750687-e7af-4d6a-8178-058215b4f2e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7" Apr 22 19:24:20.797768 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.797325 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69750687-e7af-4d6a-8178-058215b4f2e5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7\" (UID: \"69750687-e7af-4d6a-8178-058215b4f2e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7" Apr 22 19:24:20.797768 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.797355 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/988f94f3-b4ce-498d-9c0c-422f36f04ed5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.797768 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.797398 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/988f94f3-b4ce-498d-9c0c-422f36f04ed5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.797768 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.797422 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/988f94f3-b4ce-498d-9c0c-422f36f04ed5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: 
\"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.797768 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.797511 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4l2n\" (UniqueName: \"kubernetes.io/projected/988f94f3-b4ce-498d-9c0c-422f36f04ed5-kube-api-access-k4l2n\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.798478 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.798396 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/988f94f3-b4ce-498d-9c0c-422f36f04ed5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.800483 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.800457 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/988f94f3-b4ce-498d-9c0c-422f36f04ed5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.800613 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.800533 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/988f94f3-b4ce-498d-9c0c-422f36f04ed5-hub\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.800613 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.800534 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/988f94f3-b4ce-498d-9c0c-422f36f04ed5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.800849 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.800829 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69750687-e7af-4d6a-8178-058215b4f2e5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7\" (UID: \"69750687-e7af-4d6a-8178-058215b4f2e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7" Apr 22 19:24:20.800931 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.800888 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/988f94f3-b4ce-498d-9c0c-422f36f04ed5-ca\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.807473 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.807324 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dklr9\" (UniqueName: 
\"kubernetes.io/projected/69750687-e7af-4d6a-8178-058215b4f2e5-kube-api-access-dklr9\") pod \"managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7\" (UID: \"69750687-e7af-4d6a-8178-058215b4f2e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7" Apr 22 19:24:20.808362 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.808336 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4l2n\" (UniqueName: \"kubernetes.io/projected/988f94f3-b4ce-498d-9c0c-422f36f04ed5-kube-api-access-k4l2n\") pod \"cluster-proxy-proxy-agent-867c6dc468-fkzqr\" (UID: \"988f94f3-b4ce-498d-9c0c-422f36f04ed5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.907944 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.907906 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7" Apr 22 19:24:20.921635 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.921606 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:24:20.947113 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.947056 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:20.947288 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.947059 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:20.947288 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.947073 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:20.950057 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.950035 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tlsqf\"" Apr 22 19:24:20.950199 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.950063 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:24:20.950199 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.950082 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:24:20.950329 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.950309 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:24:20.950444 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.950311 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:24:20.950444 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:20.950418 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wrgld\"" Apr 22 19:24:21.503612 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:21.503575 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert\") pod \"ingress-canary-zgpcw\" (UID: \"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:24:21.503903 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:21.503649 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:21.503903 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:21.503696 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:21.503903 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:21.503759 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:21.503903 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:21.503823 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:21.503903 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:21.503842 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert podName:99054ff8-b2bf-49da-9d88-9f03b317fea0 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:23.503820071 +0000 UTC m=+36.151834854 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert") pod "ingress-canary-zgpcw" (UID: "99054ff8-b2bf-49da-9d88-9f03b317fea0") : secret "canary-serving-cert" not found Apr 22 19:24:21.503903 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:21.503761 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:21.503903 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:21.503873 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74968f5d75-g9w5s: secret "image-registry-tls" not found Apr 22 19:24:21.503903 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:21.503879 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls podName:7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:23.503861805 +0000 UTC m=+36.151876595 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls") pod "dns-default-mmxtx" (UID: "7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78") : secret "dns-default-metrics-tls" not found Apr 22 19:24:21.504247 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:21.503932 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls podName:19316983-25d3-46c3-a82f-273e8d4421aa nodeName:}" failed. No retries permitted until 2026-04-22 19:24:23.503918894 +0000 UTC m=+36.151933677 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls") pod "image-registry-74968f5d75-g9w5s" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa") : secret "image-registry-tls" not found Apr 22 19:24:21.916015 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:21.915982 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7"] Apr 22 19:24:21.919047 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:21.919025 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr"] Apr 22 19:24:21.982701 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:24:21.982662 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69750687_e7af_4d6a_8178_058215b4f2e5.slice/crio-446dfc2c8d3e8c8ef9a278dcf4b781d842bc2d11ef5a76cee8140b1f1722ab49 WatchSource:0}: Error finding container 446dfc2c8d3e8c8ef9a278dcf4b781d842bc2d11ef5a76cee8140b1f1722ab49: Status 404 returned error can't find the container with id 446dfc2c8d3e8c8ef9a278dcf4b781d842bc2d11ef5a76cee8140b1f1722ab49 Apr 22 19:24:21.982918 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:24:21.982900 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod988f94f3_b4ce_498d_9c0c_422f36f04ed5.slice/crio-c218f9cac4a2890b36d36460355053bcc5371aa507e49fff99146d63502910dc WatchSource:0}: Error finding container c218f9cac4a2890b36d36460355053bcc5371aa507e49fff99146d63502910dc: Status 404 returned error can't find the container with id 
c218f9cac4a2890b36d36460355053bcc5371aa507e49fff99146d63502910dc Apr 22 19:24:22.152260 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:22.152052 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7" event={"ID":"69750687-e7af-4d6a-8178-058215b4f2e5","Type":"ContainerStarted","Data":"446dfc2c8d3e8c8ef9a278dcf4b781d842bc2d11ef5a76cee8140b1f1722ab49"} Apr 22 19:24:22.153079 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:22.153049 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" event={"ID":"988f94f3-b4ce-498d-9c0c-422f36f04ed5","Type":"ContainerStarted","Data":"c218f9cac4a2890b36d36460355053bcc5371aa507e49fff99146d63502910dc"} Apr 22 19:24:23.159655 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:23.159568 2569 generic.go:358] "Generic (PLEG): container finished" podID="bf2d6d69-a4bd-4d9a-b48c-1f85a054c228" containerID="e1eed542a0a274482011a62a92c59b5a060a97a4d42f9288698b761053638058" exitCode=0 Apr 22 19:24:23.159655 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:23.159639 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9sbh" event={"ID":"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228","Type":"ContainerDied","Data":"e1eed542a0a274482011a62a92c59b5a060a97a4d42f9288698b761053638058"} Apr 22 19:24:23.520081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:23.520032 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert\") pod \"ingress-canary-zgpcw\" (UID: \"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:24:23.520271 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:23.520152 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:23.520271 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:23.520194 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:23.520386 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:23.520330 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:23.520439 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:23.520394 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls podName:7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:27.52037562 +0000 UTC m=+40.168390406 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls") pod "dns-default-mmxtx" (UID: "7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78") : secret "dns-default-metrics-tls" not found Apr 22 19:24:23.520765 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:23.520614 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:23.520765 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:23.520674 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert podName:99054ff8-b2bf-49da-9d88-9f03b317fea0 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:27.52066041 +0000 UTC m=+40.168675202 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert") pod "ingress-canary-zgpcw" (UID: "99054ff8-b2bf-49da-9d88-9f03b317fea0") : secret "canary-serving-cert" not found Apr 22 19:24:23.520765 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:23.520613 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:23.520765 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:23.520697 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74968f5d75-g9w5s: secret "image-registry-tls" not found Apr 22 19:24:23.520765 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:23.520734 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls podName:19316983-25d3-46c3-a82f-273e8d4421aa nodeName:}" failed. No retries permitted until 2026-04-22 19:24:27.520724103 +0000 UTC m=+40.168738887 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls") pod "image-registry-74968f5d75-g9w5s" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa") : secret "image-registry-tls" not found Apr 22 19:24:24.164757 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:24.164719 2569 generic.go:358] "Generic (PLEG): container finished" podID="bf2d6d69-a4bd-4d9a-b48c-1f85a054c228" containerID="3459fa2c85fecd5ea5a3580a439fa158e791210a4e058002ec57d0f1ec732e4d" exitCode=0 Apr 22 19:24:24.165373 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:24.164790 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9sbh" event={"ID":"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228","Type":"ContainerDied","Data":"3459fa2c85fecd5ea5a3580a439fa158e791210a4e058002ec57d0f1ec732e4d"} Apr 22 19:24:26.852492 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:26.852407 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret\") pod \"global-pull-secret-syncer-jwtj4\" (UID: \"a3e9c485-cff6-44ce-b842-b27605d809bb\") " pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:26.856075 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:26.856052 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3e9c485-cff6-44ce-b842-b27605d809bb-original-pull-secret\") pod \"global-pull-secret-syncer-jwtj4\" (UID: \"a3e9c485-cff6-44ce-b842-b27605d809bb\") " pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:26.974775 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:26.974736 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jwtj4" Apr 22 19:24:27.113423 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:27.113348 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jwtj4"] Apr 22 19:24:27.116639 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:24:27.116616 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3e9c485_cff6_44ce_b842_b27605d809bb.slice/crio-6a12d3962fff624a515baa7620eb1e0a5610209bf46598ced9ed4fb6a32b939b WatchSource:0}: Error finding container 6a12d3962fff624a515baa7620eb1e0a5610209bf46598ced9ed4fb6a32b939b: Status 404 returned error can't find the container with id 6a12d3962fff624a515baa7620eb1e0a5610209bf46598ced9ed4fb6a32b939b Apr 22 19:24:27.174240 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:27.174202 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9sbh" event={"ID":"bf2d6d69-a4bd-4d9a-b48c-1f85a054c228","Type":"ContainerStarted","Data":"3df13d2ba4e3652d2dbcc821db7e9222f54a2ac5e6c1a0ffdd1686016fed5e6c"} Apr 22 19:24:27.175441 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:27.175410 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7" event={"ID":"69750687-e7af-4d6a-8178-058215b4f2e5","Type":"ContainerStarted","Data":"6c2562c023f3c1c9f000c4c8126764b6aa1a0f962f5f7fa0616aced61abf4d1b"} Apr 22 19:24:27.176640 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:27.176606 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" event={"ID":"988f94f3-b4ce-498d-9c0c-422f36f04ed5","Type":"ContainerStarted","Data":"9309e598c12264bb9ca9a2860135194f0d31da9f60de2c05e0c2dea3bd3e5a50"} Apr 22 19:24:27.177419 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:27.177402 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jwtj4" event={"ID":"a3e9c485-cff6-44ce-b842-b27605d809bb","Type":"ContainerStarted","Data":"6a12d3962fff624a515baa7620eb1e0a5610209bf46598ced9ed4fb6a32b939b"} Apr 22 19:24:27.200576 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:27.200526 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-x9sbh" podStartSLOduration=6.359308712 podStartE2EDuration="39.200501956s" podCreationTimestamp="2026-04-22 19:23:48 +0000 UTC" firstStartedPulling="2026-04-22 19:23:49.167875141 +0000 UTC m=+1.815889922" lastFinishedPulling="2026-04-22 19:24:22.009068384 +0000 UTC m=+34.657083166" observedRunningTime="2026-04-22 19:24:27.198953499 +0000 UTC m=+39.846968303" watchObservedRunningTime="2026-04-22 19:24:27.200501956 +0000 UTC m=+39.848516759" Apr 22 19:24:27.214742 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:27.214689 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7" podStartSLOduration=2.7081694990000003 podStartE2EDuration="7.214676667s" podCreationTimestamp="2026-04-22 19:24:20 +0000 UTC" firstStartedPulling="2026-04-22 19:24:21.987682366 +0000 UTC m=+34.635697151" lastFinishedPulling="2026-04-22 19:24:26.494189527 +0000 UTC m=+39.142204319" observedRunningTime="2026-04-22 19:24:27.214239447 +0000 UTC m=+39.862254250" 
watchObservedRunningTime="2026-04-22 19:24:27.214676667 +0000 UTC m=+39.862691471" Apr 22 19:24:27.559338 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:27.559291 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:27.559516 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:27.559379 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:27.559516 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:27.559472 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert\") pod \"ingress-canary-zgpcw\" (UID: \"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:24:27.559516 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:27.559481 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:27.559516 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:27.559505 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74968f5d75-g9w5s: secret "image-registry-tls" not found Apr 22 19:24:27.559693 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:27.559517 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:27.559693 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:27.559570 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls podName:19316983-25d3-46c3-a82f-273e8d4421aa nodeName:}" failed. No retries permitted until 2026-04-22 19:24:35.559550447 +0000 UTC m=+48.207565231 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls") pod "image-registry-74968f5d75-g9w5s" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa") : secret "image-registry-tls" not found Apr 22 19:24:27.559693 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:27.559586 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls podName:7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:35.559578008 +0000 UTC m=+48.207592789 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls") pod "dns-default-mmxtx" (UID: "7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78") : secret "dns-default-metrics-tls" not found Apr 22 19:24:27.559693 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:27.559677 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:27.559853 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:27.559736 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert podName:99054ff8-b2bf-49da-9d88-9f03b317fea0 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:35.559719408 +0000 UTC m=+48.207734193 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert") pod "ingress-canary-zgpcw" (UID: "99054ff8-b2bf-49da-9d88-9f03b317fea0") : secret "canary-serving-cert" not found Apr 22 19:24:31.188340 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:31.188307 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" event={"ID":"988f94f3-b4ce-498d-9c0c-422f36f04ed5","Type":"ContainerStarted","Data":"543ad005e251280150f5be196183654c547040fd230be06ae394ebf365be9161"} Apr 22 19:24:32.191842 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:32.191806 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" event={"ID":"988f94f3-b4ce-498d-9c0c-422f36f04ed5","Type":"ContainerStarted","Data":"48570c72b69eeaab81282cede03e0619d7411cca4be91afec7399185f14bcbbb"} Apr 22 19:24:32.193213 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:32.193188 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jwtj4" event={"ID":"a3e9c485-cff6-44ce-b842-b27605d809bb","Type":"ContainerStarted","Data":"e96ac3d4f698d85baa89cc809561adad0d5b4475cc6c70ea6e6476e29c34b102"} Apr 22 19:24:32.212070 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:32.212018 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" podStartSLOduration=3.111832851 podStartE2EDuration="12.212005016s" podCreationTimestamp="2026-04-22 19:24:20 +0000 UTC" firstStartedPulling="2026-04-22 19:24:21.987601078 +0000 UTC m=+34.635615862" lastFinishedPulling="2026-04-22 19:24:31.087773245 +0000 UTC m=+43.735788027" observedRunningTime="2026-04-22 19:24:32.210423514 +0000 UTC m=+44.858438341" watchObservedRunningTime="2026-04-22 19:24:32.212005016 +0000 UTC m=+44.860019819" Apr 22 19:24:32.226312 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:32.226272 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jwtj4" podStartSLOduration=17.899742409 podStartE2EDuration="22.22625887s" podCreationTimestamp="2026-04-22 19:24:10 +0000 UTC" firstStartedPulling="2026-04-22 19:24:27.118278429 +0000 UTC m=+39.766293214" lastFinishedPulling="2026-04-22 19:24:31.444794893 +0000 UTC m=+44.092809675" observedRunningTime="2026-04-22 19:24:32.225791082 +0000 UTC m=+44.873805888" watchObservedRunningTime="2026-04-22 19:24:32.22625887 +0000 UTC m=+44.874273658" Apr 22 19:24:35.629658 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:24:35.629613 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:35.630152 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:35.629682 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:35.630152 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:35.629725 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert\") pod \"ingress-canary-zgpcw\" (UID: \"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:24:35.630152 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:35.629797 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:35.630152 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:35.629815 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:35.630152 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:35.629822 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74968f5d75-g9w5s: secret "image-registry-tls" not found Apr 22 19:24:35.630152 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:35.629833 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:35.630152 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:35.629896 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls podName:19316983-25d3-46c3-a82f-273e8d4421aa nodeName:}" failed. No retries permitted until 2026-04-22 19:24:51.629874602 +0000 UTC m=+64.277889398 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls") pod "image-registry-74968f5d75-g9w5s" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa") : secret "image-registry-tls" not found Apr 22 19:24:35.630152 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:35.629916 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert podName:99054ff8-b2bf-49da-9d88-9f03b317fea0 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:51.629907409 +0000 UTC m=+64.277922201 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert") pod "ingress-canary-zgpcw" (UID: "99054ff8-b2bf-49da-9d88-9f03b317fea0") : secret "canary-serving-cert" not found Apr 22 19:24:35.630152 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:35.629931 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls podName:7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:51.629923134 +0000 UTC m=+64.277937916 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls") pod "dns-default-mmxtx" (UID: "7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78") : secret "dns-default-metrics-tls" not found Apr 22 19:24:47.152291 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:47.152257 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-99kk8" Apr 22 19:24:51.636501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:51.636456 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:24:51.636501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:51.636508 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:24:51.637003 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:51.636545 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert\") pod \"ingress-canary-zgpcw\" (UID: \"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:24:51.637003 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:51.636623 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:51.637003 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:51.636630 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:51.637003 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:51.636654 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74968f5d75-g9w5s: secret "image-registry-tls" not found Apr 22 19:24:51.637003 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:51.636673 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls podName:7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:23.636659449 +0000 UTC m=+96.284674231 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls") pod "dns-default-mmxtx" (UID: "7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78") : secret "dns-default-metrics-tls" not found Apr 22 19:24:51.637003 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:51.636723 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls podName:19316983-25d3-46c3-a82f-273e8d4421aa nodeName:}" failed. No retries permitted until 2026-04-22 19:25:23.636705295 +0000 UTC m=+96.284720078 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls") pod "image-registry-74968f5d75-g9w5s" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa") : secret "image-registry-tls" not found Apr 22 19:24:51.637003 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:51.636725 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:51.637003 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:51.636785 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert podName:99054ff8-b2bf-49da-9d88-9f03b317fea0 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:23.63677441 +0000 UTC m=+96.284789197 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert") pod "ingress-canary-zgpcw" (UID: "99054ff8-b2bf-49da-9d88-9f03b317fea0") : secret "canary-serving-cert" not found Apr 22 19:24:52.644035 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:52.643991 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:24:52.646695 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:52.646678 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:24:52.654972 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:52.654953 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:24:52.655027 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:24:52.655009 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs podName:f4583537-f5a4-4201-a5ba-5c41cf04b3da nodeName:}" failed. No retries permitted until 2026-04-22 19:25:56.65499442 +0000 UTC m=+129.303009202 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs") pod "network-metrics-daemon-dx52z" (UID: "f4583537-f5a4-4201-a5ba-5c41cf04b3da") : secret "metrics-daemon-secret" not found Apr 22 19:24:52.744590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:52.744554 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mx4j\" (UniqueName: \"kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j\") pod \"network-check-target-mjd2c\" (UID: \"6897e3de-61a5-4d68-9638-35ac613b4f31\") " pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:52.747515 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:52.747497 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:24:52.757145 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:52.757125 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:24:52.768340 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:52.768314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mx4j\" (UniqueName: \"kubernetes.io/projected/6897e3de-61a5-4d68-9638-35ac613b4f31-kube-api-access-9mx4j\") pod \"network-check-target-mjd2c\" (UID: \"6897e3de-61a5-4d68-9638-35ac613b4f31\") " pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:52.790680 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:52.790657 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wrgld\"" Apr 22 19:24:52.799003 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:52.798985 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:52.932161 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:52.932053 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mjd2c"] Apr 22 19:24:52.937993 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:24:52.937966 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6897e3de_61a5_4d68_9638_35ac613b4f31.slice/crio-4946555805c14f98dcc719ca6b834a8728b9539ccee2d72af24aa5f4871447b1 WatchSource:0}: Error finding container 4946555805c14f98dcc719ca6b834a8728b9539ccee2d72af24aa5f4871447b1: Status 404 returned error can't find the container with id 4946555805c14f98dcc719ca6b834a8728b9539ccee2d72af24aa5f4871447b1 Apr 22 19:24:53.235041 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:53.234987 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mjd2c" event={"ID":"6897e3de-61a5-4d68-9638-35ac613b4f31","Type":"ContainerStarted","Data":"4946555805c14f98dcc719ca6b834a8728b9539ccee2d72af24aa5f4871447b1"} Apr 22 19:24:56.243131 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:56.243081 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mjd2c" event={"ID":"6897e3de-61a5-4d68-9638-35ac613b4f31","Type":"ContainerStarted","Data":"9ee104dd3575e444be02ee7a95119099ab730aff38bd416b824c16ae478c63d8"} Apr 22 19:24:56.243564 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:56.243225 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:24:56.261304 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:24:56.261257 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mjd2c" podStartSLOduration=65.115912038 podStartE2EDuration="1m8.261244594s" podCreationTimestamp="2026-04-22 19:23:48 +0000 UTC" firstStartedPulling="2026-04-22 19:24:52.940222058 +0000 UTC m=+65.588236843" lastFinishedPulling="2026-04-22 19:24:56.085554617 +0000 UTC m=+68.733569399" observedRunningTime="2026-04-22 19:24:56.260627213 +0000 UTC m=+68.908642017" watchObservedRunningTime="2026-04-22 19:24:56.261244594 +0000 UTC m=+68.909259394" Apr 22 19:25:23.672732 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:25:23.672693 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:25:23.673317 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:25:23.672752 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:25:23.673317 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:25:23.672807 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert\") pod \"ingress-canary-zgpcw\" (UID: 
\"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:25:23.673317 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:25:23.672863 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:25:23.673317 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:25:23.672888 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74968f5d75-g9w5s: secret "image-registry-tls" not found Apr 22 19:25:23.673317 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:25:23.672904 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:25:23.673317 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:25:23.672933 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:25:23.673317 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:25:23.672966 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls podName:19316983-25d3-46c3-a82f-273e8d4421aa nodeName:}" failed. No retries permitted until 2026-04-22 19:26:27.672947279 +0000 UTC m=+160.320962079 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls") pod "image-registry-74968f5d75-g9w5s" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa") : secret "image-registry-tls" not found Apr 22 19:25:23.673317 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:25:23.672989 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert podName:99054ff8-b2bf-49da-9d88-9f03b317fea0 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:27.67297219 +0000 UTC m=+160.320986990 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert") pod "ingress-canary-zgpcw" (UID: "99054ff8-b2bf-49da-9d88-9f03b317fea0") : secret "canary-serving-cert" not found Apr 22 19:25:23.673317 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:25:23.673007 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls podName:7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:27.672998263 +0000 UTC m=+160.321013050 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls") pod "dns-default-mmxtx" (UID: "7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78") : secret "dns-default-metrics-tls" not found Apr 22 19:25:27.249025 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:25:27.248994 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mjd2c" Apr 22 19:25:56.712206 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:25:56.712168 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:25:56.712805 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:25:56.712316 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:25:56.712805 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:25:56.712397 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs podName:f4583537-f5a4-4201-a5ba-5c41cf04b3da nodeName:}" failed. No retries permitted until 2026-04-22 19:27:58.712379712 +0000 UTC m=+251.360394494 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs") pod "network-metrics-daemon-dx52z" (UID: "f4583537-f5a4-4201-a5ba-5c41cf04b3da") : secret "metrics-daemon-secret" not found Apr 22 19:26:21.913608 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:21.913576 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9rtnc_51e23bda-7f24-43f3-9b0b-9e0f8a95c02f/dns-node-resolver/0.log" Apr 22 19:26:22.788719 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:26:22.788669 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" podUID="19316983-25d3-46c3-a82f-273e8d4421aa" Apr 22 19:26:22.805844 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:26:22.805805 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mmxtx" podUID="7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78" Apr 22 19:26:22.833326 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:26:22.833287 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zgpcw" podUID="99054ff8-b2bf-49da-9d88-9f03b317fea0" Apr 22 19:26:22.902767 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:22.902744 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6b4cg_3dc70558-ecae-4e50-82a2-3b1c70e5cfb2/node-ca/0.log" Apr 22 19:26:23.453348 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:23.453318 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:26:23.453705 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:23.453318 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mmxtx" Apr 22 19:26:23.981338 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:26:23.981285 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-dx52z" podUID="f4583537-f5a4-4201-a5ba-5c41cf04b3da" Apr 22 19:26:27.463988 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:27.463954 2569 generic.go:358] "Generic (PLEG): container finished" podID="69750687-e7af-4d6a-8178-058215b4f2e5" containerID="6c2562c023f3c1c9f000c4c8126764b6aa1a0f962f5f7fa0616aced61abf4d1b" exitCode=255 Apr 22 19:26:27.464556 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:27.464027 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7" event={"ID":"69750687-e7af-4d6a-8178-058215b4f2e5","Type":"ContainerDied","Data":"6c2562c023f3c1c9f000c4c8126764b6aa1a0f962f5f7fa0616aced61abf4d1b"} Apr 22 19:26:27.464556 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:27.464343 2569 scope.go:117] "RemoveContainer" containerID="6c2562c023f3c1c9f000c4c8126764b6aa1a0f962f5f7fa0616aced61abf4d1b" Apr 22 19:26:27.750557 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:27.750452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls\") pod \"image-registry-74968f5d75-g9w5s\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:26:27.750557 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:27.750511 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:26:27.750794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:27.750560 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert\") pod \"ingress-canary-zgpcw\" (UID: \"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:26:27.750794 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:26:27.750609 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:26:27.750794 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:26:27.750635 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74968f5d75-g9w5s: secret "image-registry-tls" not found Apr 22 19:26:27.750794 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:26:27.750660 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:26:27.750794 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:26:27.750660 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 22 19:26:27.750794 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:26:27.750695 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls podName:19316983-25d3-46c3-a82f-273e8d4421aa nodeName:}" failed. No retries permitted until 2026-04-22 19:28:29.750675017 +0000 UTC m=+282.398689800 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls") pod "image-registry-74968f5d75-g9w5s" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa") : secret "image-registry-tls" not found Apr 22 19:26:27.750794 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:26:27.750710 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert podName:99054ff8-b2bf-49da-9d88-9f03b317fea0 nodeName:}" failed. No retries permitted until 2026-04-22 19:28:29.750703579 +0000 UTC m=+282.398718361 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert") pod "ingress-canary-zgpcw" (UID: "99054ff8-b2bf-49da-9d88-9f03b317fea0") : secret "canary-serving-cert" not found Apr 22 19:26:27.750794 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:26:27.750722 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls podName:7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78 nodeName:}" failed. No retries permitted until 2026-04-22 19:28:29.750715383 +0000 UTC m=+282.398730164 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls") pod "dns-default-mmxtx" (UID: "7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78") : secret "dns-default-metrics-tls" not found Apr 22 19:26:28.470392 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:28.470354 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84c4bbd6d7-jf2p7" event={"ID":"69750687-e7af-4d6a-8178-058215b4f2e5","Type":"ContainerStarted","Data":"97ed0ff72fdf29f6c25451f3500cb6b77d65e9926394874d2b0ad4a9daf3fd43"} Apr 22 19:26:37.948593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:37.948509 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:26:37.948948 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:37.948686 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:26:53.310811 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.310777 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-d55df"] Apr 22 19:26:53.312692 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.312676 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.316436 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.316411 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-74968f5d75-g9w5s"] Apr 22 19:26:53.316645 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.316611 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 19:26:53.316723 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:26:53.316648 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" podUID="19316983-25d3-46c3-a82f-273e8d4421aa" Apr 22 19:26:53.316723 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.316681 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:26:53.316723 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.316693 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 19:26:53.316870 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.316682 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pr677\"" Apr 22 19:26:53.316870 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.316771 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:26:53.337160 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.337131 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d55df"] Apr 22 19:26:53.343544 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.343517 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2a39dd47-7813-45d4-bf4e-249d40368c54-data-volume\") pod \"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.343652 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.343594 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2a39dd47-7813-45d4-bf4e-249d40368c54-crio-socket\") pod \"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.343652 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.343617 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rwf4\" (UniqueName: \"kubernetes.io/projected/2a39dd47-7813-45d4-bf4e-249d40368c54-kube-api-access-2rwf4\") pod \"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.343652 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.343640 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/2a39dd47-7813-45d4-bf4e-249d40368c54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.343749 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.343717 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a39dd47-7813-45d4-bf4e-249d40368c54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.444403 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.444364 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a39dd47-7813-45d4-bf4e-249d40368c54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.444570 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.444415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2a39dd47-7813-45d4-bf4e-249d40368c54-data-volume\") pod \"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.444611 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.444584 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2a39dd47-7813-45d4-bf4e-249d40368c54-crio-socket\") pod \"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.444645 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.444614 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rwf4\" (UniqueName: \"kubernetes.io/projected/2a39dd47-7813-45d4-bf4e-249d40368c54-kube-api-access-2rwf4\") pod \"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.444645 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.444633 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2a39dd47-7813-45d4-bf4e-249d40368c54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.444739 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.444706 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2a39dd47-7813-45d4-bf4e-249d40368c54-crio-socket\") pod \"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.444785 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.444735 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2a39dd47-7813-45d4-bf4e-249d40368c54-data-volume\") pod 
\"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.445039 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.445020 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2a39dd47-7813-45d4-bf4e-249d40368c54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.446932 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.446913 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a39dd47-7813-45d4-bf4e-249d40368c54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.458349 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.458319 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rwf4\" (UniqueName: \"kubernetes.io/projected/2a39dd47-7813-45d4-bf4e-249d40368c54-kube-api-access-2rwf4\") pod \"insights-runtime-extractor-d55df\" (UID: \"2a39dd47-7813-45d4-bf4e-249d40368c54\") " pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.529460 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.527586 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:26:53.533700 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.533676 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:26:53.545105 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.545075 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptksp\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-kube-api-access-ptksp\") pod \"19316983-25d3-46c3-a82f-273e8d4421aa\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " Apr 22 19:26:53.545247 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.545125 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19316983-25d3-46c3-a82f-273e8d4421aa-registry-certificates\") pod \"19316983-25d3-46c3-a82f-273e8d4421aa\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " Apr 22 19:26:53.545247 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.545144 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19316983-25d3-46c3-a82f-273e8d4421aa-installation-pull-secrets\") pod \"19316983-25d3-46c3-a82f-273e8d4421aa\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " Apr 22 19:26:53.545362 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.545344 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/19316983-25d3-46c3-a82f-273e8d4421aa-image-registry-private-configuration\") pod \"19316983-25d3-46c3-a82f-273e8d4421aa\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " Apr 22 19:26:53.545411 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.545391 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19316983-25d3-46c3-a82f-273e8d4421aa-trusted-ca\") pod \"19316983-25d3-46c3-a82f-273e8d4421aa\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " Apr 22 19:26:53.545463 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.545441 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19316983-25d3-46c3-a82f-273e8d4421aa-ca-trust-extracted\") pod \"19316983-25d3-46c3-a82f-273e8d4421aa\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " Apr 22 19:26:53.545549 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.545522 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19316983-25d3-46c3-a82f-273e8d4421aa-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "19316983-25d3-46c3-a82f-273e8d4421aa" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:26:53.545768 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.545739 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19316983-25d3-46c3-a82f-273e8d4421aa-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "19316983-25d3-46c3-a82f-273e8d4421aa" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:26:53.546013 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.545757 2569 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19316983-25d3-46c3-a82f-273e8d4421aa-registry-certificates\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:26:53.546013 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.545801 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19316983-25d3-46c3-a82f-273e8d4421aa-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "19316983-25d3-46c3-a82f-273e8d4421aa" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:26:53.547594 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.547564 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19316983-25d3-46c3-a82f-273e8d4421aa-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "19316983-25d3-46c3-a82f-273e8d4421aa" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:26:53.547690 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.547618 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-kube-api-access-ptksp" (OuterVolumeSpecName: "kube-api-access-ptksp") pod "19316983-25d3-46c3-a82f-273e8d4421aa" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa"). InnerVolumeSpecName "kube-api-access-ptksp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:26:53.547690 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.547622 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19316983-25d3-46c3-a82f-273e8d4421aa-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "19316983-25d3-46c3-a82f-273e8d4421aa" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:26:53.621603 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.621509 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d55df" Apr 22 19:26:53.646447 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.646415 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-bound-sa-token\") pod \"19316983-25d3-46c3-a82f-273e8d4421aa\" (UID: \"19316983-25d3-46c3-a82f-273e8d4421aa\") " Apr 22 19:26:53.646688 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.646670 2569 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19316983-25d3-46c3-a82f-273e8d4421aa-ca-trust-extracted\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:26:53.646728 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.646697 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ptksp\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-kube-api-access-ptksp\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:26:53.646728 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.646712 2569 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19316983-25d3-46c3-a82f-273e8d4421aa-installation-pull-secrets\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:26:53.646796 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.646728 2569 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/19316983-25d3-46c3-a82f-273e8d4421aa-image-registry-private-configuration\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:26:53.646796 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.646746 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19316983-25d3-46c3-a82f-273e8d4421aa-trusted-ca\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:26:53.648557 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.648535 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "19316983-25d3-46c3-a82f-273e8d4421aa" (UID: "19316983-25d3-46c3-a82f-273e8d4421aa"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:26:53.747222 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.747177 2569 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-bound-sa-token\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:26:53.748001 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:53.747976 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d55df"] Apr 22 19:26:53.751116 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:26:53.751066 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a39dd47_7813_45d4_bf4e_249d40368c54.slice/crio-f3de619353d07ba44a47952b2d4079e89f185956bec2ebfd98052fe6b6f7e68d WatchSource:0}: Error finding container f3de619353d07ba44a47952b2d4079e89f185956bec2ebfd98052fe6b6f7e68d: Status 404 returned error can't find the container with id f3de619353d07ba44a47952b2d4079e89f185956bec2ebfd98052fe6b6f7e68d Apr 22 19:26:54.531480 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:54.531446 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d55df" event={"ID":"2a39dd47-7813-45d4-bf4e-249d40368c54","Type":"ContainerStarted","Data":"886e3030f6645d25691e9154fcd414122dd6d307ed509bef5d8581f45367396e"} Apr 22 19:26:54.531480 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:54.531474 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74968f5d75-g9w5s" Apr 22 19:26:54.531480 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:54.531485 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d55df" event={"ID":"2a39dd47-7813-45d4-bf4e-249d40368c54","Type":"ContainerStarted","Data":"65ba979237adfede6415234fa63bbfc6c070979427ddb1979c38ee2756bce52e"} Apr 22 19:26:54.531964 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:54.531500 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d55df" event={"ID":"2a39dd47-7813-45d4-bf4e-249d40368c54","Type":"ContainerStarted","Data":"f3de619353d07ba44a47952b2d4079e89f185956bec2ebfd98052fe6b6f7e68d"} Apr 22 19:26:54.596780 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:54.596745 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-74968f5d75-g9w5s"] Apr 22 19:26:54.602608 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:54.602579 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-74968f5d75-g9w5s"] Apr 22 19:26:54.653070 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:54.653039 2569 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19316983-25d3-46c3-a82f-273e8d4421aa-registry-tls\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:26:55.950633 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:55.950601 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19316983-25d3-46c3-a82f-273e8d4421aa" path="/var/lib/kubelet/pods/19316983-25d3-46c3-a82f-273e8d4421aa/volumes" Apr 22 19:26:56.538121 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:56.538062 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-d55df" event={"ID":"2a39dd47-7813-45d4-bf4e-249d40368c54","Type":"ContainerStarted","Data":"95f50e746b80faf5dfa2efc68adcda385a2db6afb1c56b7295178d54f7ec86e2"} Apr 22 19:26:56.563666 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:26:56.563609 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-d55df" podStartSLOduration=1.48912953 podStartE2EDuration="3.56359283s" podCreationTimestamp="2026-04-22 19:26:53 +0000 UTC" firstStartedPulling="2026-04-22 19:26:53.817223127 +0000 UTC m=+186.465237909" lastFinishedPulling="2026-04-22 19:26:55.891686426 +0000 UTC m=+188.539701209" observedRunningTime="2026-04-22 19:26:56.561806632 +0000 UTC m=+189.209821433" watchObservedRunningTime="2026-04-22 19:26:56.56359283 +0000 UTC m=+189.211607631" Apr 22 19:27:00.923273 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:00.923201 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" podUID="988f94f3-b4ce-498d-9c0c-422f36f04ed5" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 19:27:02.068968 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.068932 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k"] Apr 22 19:27:02.071804 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.071781 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:02.074547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.074521 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:27:02.074661 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.074521 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:27:02.075791 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.075770 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 19:27:02.075894 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.075791 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 19:27:02.075894 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.075779 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-srv7x\"" Apr 22 19:27:02.075894 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.075887 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:27:02.083063 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.083038 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k"] Apr 22 19:27:02.100624 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.100585 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5txsg\" (UniqueName: \"kubernetes.io/projected/8f432698-4844-4c25-b51c-849193e9c061-kube-api-access-5txsg\") pod \"openshift-state-metrics-9d44df66c-9cf6k\" (UID: 
\"8f432698-4844-4c25-b51c-849193e9c061\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:02.100624 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.100628 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8f432698-4844-4c25-b51c-849193e9c061-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-9cf6k\" (UID: \"8f432698-4844-4c25-b51c-849193e9c061\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:02.100824 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.100653 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f432698-4844-4c25-b51c-849193e9c061-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-9cf6k\" (UID: \"8f432698-4844-4c25-b51c-849193e9c061\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:02.100824 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.100708 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f432698-4844-4c25-b51c-849193e9c061-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-9cf6k\" (UID: \"8f432698-4844-4c25-b51c-849193e9c061\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:02.128774 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.128744 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-q4gf4"] Apr 22 19:27:02.130810 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.130793 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.138725 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.138699 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:27:02.138725 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.138724 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:27:02.142250 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.142233 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:27:02.157510 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.157478 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-b8znt\"" Apr 22 19:27:02.201265 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.201221 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-accelerators-collector-config\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.201265 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.201266 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-tls\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.201505 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.201295 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-sys\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.201505 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.201320 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-root\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.201505 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.201346 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-textfile\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.201505 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.201368 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-wtmp\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.201505 
ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.201390 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5txsg\" (UniqueName: \"kubernetes.io/projected/8f432698-4844-4c25-b51c-849193e9c061-kube-api-access-5txsg\") pod \"openshift-state-metrics-9d44df66c-9cf6k\" (UID: \"8f432698-4844-4c25-b51c-849193e9c061\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:02.201505 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.201412 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8f432698-4844-4c25-b51c-849193e9c061-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-9cf6k\" (UID: \"8f432698-4844-4c25-b51c-849193e9c061\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:02.201505 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.201439 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-metrics-client-ca\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.201505 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.201462 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.201505 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.201490 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f432698-4844-4c25-b51c-849193e9c061-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-9cf6k\" (UID: \"8f432698-4844-4c25-b51c-849193e9c061\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:02.201969 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.201527 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f432698-4844-4c25-b51c-849193e9c061-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-9cf6k\" (UID: \"8f432698-4844-4c25-b51c-849193e9c061\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:02.201969 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:27:02.201613 2569 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 22 19:27:02.201969 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.201653 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9rfw\" (UniqueName: \"kubernetes.io/projected/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-kube-api-access-n9rfw\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.201969 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:27:02.201690 2569 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8f432698-4844-4c25-b51c-849193e9c061-openshift-state-metrics-tls podName:8f432698-4844-4c25-b51c-849193e9c061 nodeName:}" failed. No retries permitted until 2026-04-22 19:27:02.701667556 +0000 UTC m=+195.349682364 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/8f432698-4844-4c25-b51c-849193e9c061-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-9cf6k" (UID: "8f432698-4844-4c25-b51c-849193e9c061") : secret "openshift-state-metrics-tls" not found Apr 22 19:27:02.202223 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.202204 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f432698-4844-4c25-b51c-849193e9c061-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-9cf6k\" (UID: \"8f432698-4844-4c25-b51c-849193e9c061\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:02.204058 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.204040 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8f432698-4844-4c25-b51c-849193e9c061-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-9cf6k\" (UID: \"8f432698-4844-4c25-b51c-849193e9c061\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:02.214812 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.214786 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5txsg\" (UniqueName: \"kubernetes.io/projected/8f432698-4844-4c25-b51c-849193e9c061-kube-api-access-5txsg\") pod \"openshift-state-metrics-9d44df66c-9cf6k\" (UID: \"8f432698-4844-4c25-b51c-849193e9c061\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:02.302752 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.302710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-metrics-client-ca\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.302752 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.302749 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.303023 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.302802 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9rfw\" (UniqueName: \"kubernetes.io/projected/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-kube-api-access-n9rfw\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.303023 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.302862 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-accelerators-collector-config\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.303023 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.302992 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-tls\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.303241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.303039 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-sys\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.303241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.303123 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-sys\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.303241 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:27:02.303141 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:27:02.303241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.303160 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-root\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.303241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.303187 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-textfile\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.303241 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:27:02.303205 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-tls podName:b3a61eaf-ba59-4d3f-97cc-68c70e44c797 nodeName:}" failed. No retries permitted until 2026-04-22 19:27:02.803187978 +0000 UTC m=+195.451202761 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-tls") pod "node-exporter-q4gf4" (UID: "b3a61eaf-ba59-4d3f-97cc-68c70e44c797") : secret "node-exporter-tls" not found Apr 22 19:27:02.303561 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.303255 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-wtmp\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.303561 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.303298 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-root\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.303561 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.303401 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-wtmp\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.303561 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.303442 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-textfile\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.303561 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.303485 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-metrics-client-ca\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.303717 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.303565 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-accelerators-collector-config\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.305359 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.305338 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.315278 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.315251 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9rfw\" (UniqueName: \"kubernetes.io/projected/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-kube-api-access-n9rfw\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 
22 19:27:02.706608 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.706569 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f432698-4844-4c25-b51c-849193e9c061-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-9cf6k\" (UID: \"8f432698-4844-4c25-b51c-849193e9c061\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:02.709148 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.709083 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f432698-4844-4c25-b51c-849193e9c061-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-9cf6k\" (UID: \"8f432698-4844-4c25-b51c-849193e9c061\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:02.807410 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.807373 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-tls\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.809798 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.809769 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b3a61eaf-ba59-4d3f-97cc-68c70e44c797-node-exporter-tls\") pod \"node-exporter-q4gf4\" (UID: \"b3a61eaf-ba59-4d3f-97cc-68c70e44c797\") " pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:02.981255 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:02.981166 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" Apr 22 19:27:03.041217 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:03.040724 2569 util.go:30] "No sandbox for pod can be found. 
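The mount failures above follow a fixed pattern: MountVolume.SetUp for the openshift-state-metrics-tls and node-exporter-tls secret volumes fails with `secret ... not found`, the operation executor schedules the next attempt 500ms out (`durationBeforeRetry 500ms`), and once the secrets exist the retried SetUp succeeds at 19:27:02.709 and 19:27:02.809. The sketch below is illustrative only: the 500ms initial delay is taken from the log, but the doubling factor, the cap, and the `fetchSecret` stand-in are assumptions, not kubelet code.

```go
// Illustrative only: a retry loop in the spirit of the kubelet's volume
// backoff. The 500ms initial delay matches "durationBeforeRetry 500ms"
// above; the doubling factor, the cap, and fetchSecret are assumptions.
package main

import (
	"errors"
	"fmt"
	"time"
)

// fetchSecret stands in (hypothetically) for the API lookup that failed
// with: secret "openshift-state-metrics-tls" not found.
func fetchSecret(attempt int) error {
	if attempt < 2 { // pretend the secret is created while we wait
		return errors.New(`secret "openshift-state-metrics-tls" not found`)
	}
	return nil
}

func main() {
	delay := 500 * time.Millisecond  // initial durationBeforeRetry, as logged
	const maxDelay = 2 * time.Minute // assumed cap

	for attempt := 0; ; attempt++ {
		err := fetchSecret(attempt)
		if err == nil {
			fmt.Println("MountVolume.SetUp succeeded")
			return
		}
		fmt.Printf("MountVolume.SetUp failed: %v; retrying in %s\n", err, delay)
		time.Sleep(delay)
		delay *= 2 // assumed exponential growth between failed attempts
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```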
Need to start a new one" pod="openshift-monitoring/node-exporter-q4gf4" Apr 22 19:27:03.052643 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:27:03.052604 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3a61eaf_ba59_4d3f_97cc_68c70e44c797.slice/crio-9177b86a6d1d7dd5edbc30b7dff6415cd394204a3b8a272f9c14facd6bf5f81e WatchSource:0}: Error finding container 9177b86a6d1d7dd5edbc30b7dff6415cd394204a3b8a272f9c14facd6bf5f81e: Status 404 returned error can't find the container with id 9177b86a6d1d7dd5edbc30b7dff6415cd394204a3b8a272f9c14facd6bf5f81e Apr 22 19:27:03.121931 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:03.121894 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k"] Apr 22 19:27:03.125846 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:27:03.125817 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f432698_4844_4c25_b51c_849193e9c061.slice/crio-f10e9d2f4eed1e9fb599155840105c0e4047134ca22948a50bd6d2ace301aee0 WatchSource:0}: Error finding container f10e9d2f4eed1e9fb599155840105c0e4047134ca22948a50bd6d2ace301aee0: Status 404 returned error can't find the container with id f10e9d2f4eed1e9fb599155840105c0e4047134ca22948a50bd6d2ace301aee0 Apr 22 19:27:03.555300 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:03.555208 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q4gf4" event={"ID":"b3a61eaf-ba59-4d3f-97cc-68c70e44c797","Type":"ContainerStarted","Data":"9177b86a6d1d7dd5edbc30b7dff6415cd394204a3b8a272f9c14facd6bf5f81e"} Apr 22 19:27:03.556887 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:03.556855 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" event={"ID":"8f432698-4844-4c25-b51c-849193e9c061","Type":"ContainerStarted","Data":"7589b5e0039095f3518b8a1f0498f51ec4de656f9a2772eb9b770c47ba99e126"} Apr 22 19:27:03.557013 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:03.556895 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" event={"ID":"8f432698-4844-4c25-b51c-849193e9c061","Type":"ContainerStarted","Data":"b144c0783758ebef9c126f97831c32f4446ab62ac250e8a6ed7f3110e1b3c5e9"} Apr 22 19:27:03.557013 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:03.556910 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" event={"ID":"8f432698-4844-4c25-b51c-849193e9c061","Type":"ContainerStarted","Data":"f10e9d2f4eed1e9fb599155840105c0e4047134ca22948a50bd6d2ace301aee0"} Apr 22 19:27:04.561739 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:04.561704 2569 generic.go:358] "Generic (PLEG): container finished" podID="b3a61eaf-ba59-4d3f-97cc-68c70e44c797" containerID="a94421ae9389ca25a8acda2e8ce22e936fc4046d23bafff2bfccc1fe627c9805" exitCode=0 Apr 22 19:27:04.562186 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:04.561779 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q4gf4" event={"ID":"b3a61eaf-ba59-4d3f-97cc-68c70e44c797","Type":"ContainerDied","Data":"a94421ae9389ca25a8acda2e8ce22e936fc4046d23bafff2bfccc1fe627c9805"} Apr 22 19:27:04.563850 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:04.563824 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" event={"ID":"8f432698-4844-4c25-b51c-849193e9c061","Type":"ContainerStarted","Data":"86b0c1dc98fed0d18cad773bd03e9f2f9c09dd49ed90d93425400b7a6250d9fb"} Apr 22 19:27:04.604544 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:04.604498 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9cf6k" podStartSLOduration=1.390995027 podStartE2EDuration="2.604482562s" podCreationTimestamp="2026-04-22 19:27:02 +0000 UTC" firstStartedPulling="2026-04-22 19:27:03.254200072 +0000 UTC m=+195.902214853" lastFinishedPulling="2026-04-22 19:27:04.467687606 +0000 UTC m=+197.115702388" observedRunningTime="2026-04-22 19:27:04.604421691 +0000 UTC m=+197.252436497" watchObservedRunningTime="2026-04-22 19:27:04.604482562 +0000 UTC m=+197.252497366" Apr 22 19:27:05.567823 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:05.567782 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q4gf4" event={"ID":"b3a61eaf-ba59-4d3f-97cc-68c70e44c797","Type":"ContainerStarted","Data":"57dcff899d4278026f680cb0b69ec1226032bec08c5b5ebd90d4dafc61936b5b"} Apr 22 19:27:05.567823 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:05.567826 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q4gf4" event={"ID":"b3a61eaf-ba59-4d3f-97cc-68c70e44c797","Type":"ContainerStarted","Data":"ecf51f4cbf8020509a4175474d559241a8e17639f16e8673d530cc717bccb31f"} Apr 22 19:27:05.590291 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:05.590244 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-q4gf4" podStartSLOduration=2.823622832 podStartE2EDuration="3.590228448s" podCreationTimestamp="2026-04-22 19:27:02 +0000 UTC" firstStartedPulling="2026-04-22 19:27:03.054622497 +0000 UTC m=+195.702637279" lastFinishedPulling="2026-04-22 19:27:03.821228096 +0000 UTC m=+196.469242895" observedRunningTime="2026-04-22 19:27:05.588970994 +0000 UTC m=+198.236985824" watchObservedRunningTime="2026-04-22 19:27:05.590228448 +0000 UTC m=+198.238243251" Apr 22 19:27:06.845990 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:06.845951 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-h826t"] Apr 22 19:27:06.848126 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:06.848090 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h826t" Apr 22 19:27:06.850746 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:06.850726 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-2h65c\"" Apr 22 19:27:06.851714 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:06.851684 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 19:27:06.859627 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:06.859606 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-h826t"] Apr 22 19:27:06.941277 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:06.941241 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3b4d0dd1-0012-4f37-aedf-467520762f8d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-h826t\" (UID: \"3b4d0dd1-0012-4f37-aedf-467520762f8d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h826t" Apr 22 19:27:07.042462 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:07.042420 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3b4d0dd1-0012-4f37-aedf-467520762f8d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-h826t\" (UID: \"3b4d0dd1-0012-4f37-aedf-467520762f8d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h826t" Apr 22 19:27:07.042621 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:27:07.042567 2569 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 22 19:27:07.042660 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:27:07.042639 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b4d0dd1-0012-4f37-aedf-467520762f8d-monitoring-plugin-cert podName:3b4d0dd1-0012-4f37-aedf-467520762f8d nodeName:}" failed. No retries permitted until 2026-04-22 19:27:07.542624724 +0000 UTC m=+200.190639506 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/3b4d0dd1-0012-4f37-aedf-467520762f8d-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-h826t" (UID: "3b4d0dd1-0012-4f37-aedf-467520762f8d") : secret "monitoring-plugin-cert" not found Apr 22 19:27:07.546071 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:07.546027 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3b4d0dd1-0012-4f37-aedf-467520762f8d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-h826t\" (UID: \"3b4d0dd1-0012-4f37-aedf-467520762f8d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h826t" Apr 22 19:27:07.548625 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:07.548601 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3b4d0dd1-0012-4f37-aedf-467520762f8d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-h826t\" (UID: \"3b4d0dd1-0012-4f37-aedf-467520762f8d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h826t" Apr 22 19:27:07.756852 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:07.756808 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h826t" Apr 22 19:27:07.879127 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:07.879074 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-h826t"] Apr 22 19:27:07.882325 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:27:07.882293 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b4d0dd1_0012_4f37_aedf_467520762f8d.slice/crio-fd7711317b6e8967ea6fae67084bd5e0fea5f7cc80864bd51bde5ae3b1973183 WatchSource:0}: Error finding container fd7711317b6e8967ea6fae67084bd5e0fea5f7cc80864bd51bde5ae3b1973183: Status 404 returned error can't find the container with id fd7711317b6e8967ea6fae67084bd5e0fea5f7cc80864bd51bde5ae3b1973183 Apr 22 19:27:08.577578 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:08.577536 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h826t" event={"ID":"3b4d0dd1-0012-4f37-aedf-467520762f8d","Type":"ContainerStarted","Data":"fd7711317b6e8967ea6fae67084bd5e0fea5f7cc80864bd51bde5ae3b1973183"} Apr 22 19:27:09.581901 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:09.581811 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h826t" event={"ID":"3b4d0dd1-0012-4f37-aedf-467520762f8d","Type":"ContainerStarted","Data":"c8aea37cd40e4921e2029b8cb684e665671da3b00efae949c31422cc7f95459d"} Apr 22 19:27:09.582351 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:09.582023 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h826t" Apr 22 19:27:09.586894 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:09.586871 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h826t" Apr 22 19:27:09.598341 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:09.598299 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h826t" podStartSLOduration=2.271913976 podStartE2EDuration="3.598286633s" podCreationTimestamp="2026-04-22 19:27:06 +0000 UTC" firstStartedPulling="2026-04-22 19:27:07.884166926 +0000 UTC m=+200.532181708" lastFinishedPulling="2026-04-22 19:27:09.210539566 +0000 UTC m=+201.858554365" observedRunningTime="2026-04-22 19:27:09.597997393 +0000 UTC m=+202.246012196" watchObservedRunningTime="2026-04-22 19:27:09.598286633 +0000 UTC m=+202.246301477" Apr 22 19:27:10.923531 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:10.923491 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" podUID="988f94f3-b4ce-498d-9c0c-422f36f04ed5" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 19:27:20.922979 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:20.922934 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" podUID="988f94f3-b4ce-498d-9c0c-422f36f04ed5" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 19:27:20.923445 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:20.923025 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" Apr 22 19:27:20.923558 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:20.923526 2569 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"48570c72b69eeaab81282cede03e0619d7411cca4be91afec7399185f14bcbbb"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 19:27:20.923604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:20.923590 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" podUID="988f94f3-b4ce-498d-9c0c-422f36f04ed5" containerName="service-proxy" containerID="cri-o://48570c72b69eeaab81282cede03e0619d7411cca4be91afec7399185f14bcbbb" gracePeriod=30 Apr 22 19:27:21.614957 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:21.614920 2569 generic.go:358] "Generic (PLEG): container finished" podID="988f94f3-b4ce-498d-9c0c-422f36f04ed5" containerID="48570c72b69eeaab81282cede03e0619d7411cca4be91afec7399185f14bcbbb" exitCode=2 Apr 22 19:27:21.615147 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:21.614979 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" event={"ID":"988f94f3-b4ce-498d-9c0c-422f36f04ed5","Type":"ContainerDied","Data":"48570c72b69eeaab81282cede03e0619d7411cca4be91afec7399185f14bcbbb"} Apr 22 19:27:21.615147 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:21.615005 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867c6dc468-fkzqr" event={"ID":"988f94f3-b4ce-498d-9c0c-422f36f04ed5","Type":"ContainerStarted","Data":"27521fb2f491161e3bfd076e836240d3b8a91b8301d4608491b0906ff240b693"} Apr 22 19:27:37.390559 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:37.390528 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9rtnc_51e23bda-7f24-43f3-9b0b-9e0f8a95c02f/dns-node-resolver/0.log" Apr 22 19:27:58.760703 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:58.760657 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:27:58.763322 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:58.763300 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4583537-f5a4-4201-a5ba-5c41cf04b3da-metrics-certs\") pod \"network-metrics-daemon-dx52z\" (UID: \"f4583537-f5a4-4201-a5ba-5c41cf04b3da\") " pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:27:58.952041 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:58.952007 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tlsqf\"" Apr 22 19:27:58.960195 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:58.960170 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dx52z" Apr 22 19:27:59.084769 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:59.084736 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dx52z"] Apr 22 19:27:59.087896 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:27:59.087867 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4583537_f5a4_4201_a5ba_5c41cf04b3da.slice/crio-a59c4d6df095006e03e289bdf1e1dc46e668475ef42abe98fda90fc19aaf3021 WatchSource:0}: Error finding container a59c4d6df095006e03e289bdf1e1dc46e668475ef42abe98fda90fc19aaf3021: Status 404 returned error can't find the container with id a59c4d6df095006e03e289bdf1e1dc46e668475ef42abe98fda90fc19aaf3021 Apr 22 19:27:59.715334 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:27:59.715288 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dx52z" event={"ID":"f4583537-f5a4-4201-a5ba-5c41cf04b3da","Type":"ContainerStarted","Data":"a59c4d6df095006e03e289bdf1e1dc46e668475ef42abe98fda90fc19aaf3021"} Apr 22 19:28:00.719087 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:00.719049 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dx52z" event={"ID":"f4583537-f5a4-4201-a5ba-5c41cf04b3da","Type":"ContainerStarted","Data":"7239a5684860ec78f409af0825c41ea75a1365213d9ae789a98219252cca14ea"} Apr 22 19:28:00.719483 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:00.719091 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dx52z" event={"ID":"f4583537-f5a4-4201-a5ba-5c41cf04b3da","Type":"ContainerStarted","Data":"1ddbf3d462d3c9163c4e3fe6aa35f4cdeee1405208ba73a35b5c0f074f10dc8e"} Apr 22 19:28:00.737757 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:00.737700 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dx52z" podStartSLOduration=251.812771772 podStartE2EDuration="4m12.73768437s" podCreationTimestamp="2026-04-22 19:23:48 +0000 UTC" firstStartedPulling="2026-04-22 19:27:59.08973769 +0000 UTC m=+251.737752474" lastFinishedPulling="2026-04-22 19:28:00.01465029 +0000 UTC m=+252.662665072" observedRunningTime="2026-04-22 19:28:00.736660268 +0000 UTC m=+253.384675073" watchObservedRunningTime="2026-04-22 19:28:00.73768437 +0000 UTC m=+253.385699174" Apr 22 19:28:26.454599 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:28:26.454530 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mmxtx" podUID="7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78" Apr 22 19:28:26.786832 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:26.786746 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mmxtx" Apr 22 19:28:29.789829 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:29.789787 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert\") pod \"ingress-canary-zgpcw\" (UID: \"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:28:29.789829 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:29.789833 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:28:29.792484 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:29.792460 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99054ff8-b2bf-49da-9d88-9f03b317fea0-cert\") pod \"ingress-canary-zgpcw\" (UID: \"99054ff8-b2bf-49da-9d88-9f03b317fea0\") " pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:28:29.792616 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:29.792559 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78-metrics-tls\") pod \"dns-default-mmxtx\" (UID: \"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78\") " pod="openshift-dns/dns-default-mmxtx" Apr 22 19:28:29.851929 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:29.851898 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-76gtl\"" Apr 22 19:28:29.859231 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:29.859208 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zgpcw" Apr 22 19:28:29.981326 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:29.981293 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zgpcw"] Apr 22 19:28:29.984457 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:28:29.984423 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99054ff8_b2bf_49da_9d88_9f03b317fea0.slice/crio-64bf243c19a2cc1b4ff41ac6e1d6cc8fdab8849a065300887c9e5c5632b4f61d WatchSource:0}: Error finding container 64bf243c19a2cc1b4ff41ac6e1d6cc8fdab8849a065300887c9e5c5632b4f61d: Status 404 returned error can't find the container with id 64bf243c19a2cc1b4ff41ac6e1d6cc8fdab8849a065300887c9e5c5632b4f61d Apr 22 19:28:30.091019 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:30.090939 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jd9zz\"" Apr 22 19:28:30.098185 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:30.098131 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mmxtx" Apr 22 19:28:30.230556 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:30.230525 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mmxtx"] Apr 22 19:28:30.234242 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:28:30.234214 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a1ab1b7_70c4_4d99_8e11_7ae79f3b3c78.slice/crio-f61829b32df5bc63e23e31b86f065e26c2b4ae7120fce6ae7574beb5cb907ec9 WatchSource:0}: Error finding container f61829b32df5bc63e23e31b86f065e26c2b4ae7120fce6ae7574beb5cb907ec9: Status 404 returned error can't find the container with id f61829b32df5bc63e23e31b86f065e26c2b4ae7120fce6ae7574beb5cb907ec9 Apr 22 19:28:30.797733 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:30.797481 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mmxtx" event={"ID":"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78","Type":"ContainerStarted","Data":"f61829b32df5bc63e23e31b86f065e26c2b4ae7120fce6ae7574beb5cb907ec9"} Apr 22 19:28:30.798719 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:30.798688 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zgpcw" event={"ID":"99054ff8-b2bf-49da-9d88-9f03b317fea0","Type":"ContainerStarted","Data":"64bf243c19a2cc1b4ff41ac6e1d6cc8fdab8849a065300887c9e5c5632b4f61d"} Apr 22 19:28:32.806036 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:32.805993 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mmxtx" event={"ID":"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78","Type":"ContainerStarted","Data":"625d1e6e66c953c36dc4d9b7bebff681606bdc60f3bd27fa80352f30f1800e39"} Apr 22 19:28:32.806036 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:32.806046 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mmxtx" event={"ID":"7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78","Type":"ContainerStarted","Data":"b2907e240ad970a4c0e7e7c9a702e37afb91f26f2f6ec83f8f8c0407ce423831"} Apr 22 19:28:32.806591 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:32.806118 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mmxtx" Apr 22 19:28:32.807339 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:32.807306 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zgpcw" event={"ID":"99054ff8-b2bf-49da-9d88-9f03b317fea0","Type":"ContainerStarted","Data":"f9d91b0bcd34dce2a89a83f6508adff9f03d009bc35aaa2dfd8a8a2ff548493f"} Apr 22 19:28:32.826916 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:32.826867 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mmxtx" podStartSLOduration=252.202556543 podStartE2EDuration="4m13.826855133s" podCreationTimestamp="2026-04-22 19:24:19 +0000 UTC" firstStartedPulling="2026-04-22 19:28:30.236264011 +0000 UTC m=+282.884278795" lastFinishedPulling="2026-04-22 19:28:31.860562604 +0000 UTC m=+284.508577385" observedRunningTime="2026-04-22 19:28:32.826209642 +0000 UTC m=+285.474224445" watchObservedRunningTime="2026-04-22 19:28:32.826855133 +0000 UTC m=+285.474869937" Apr 22 19:28:32.844172 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:32.844082 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zgpcw" podStartSLOduration=251.970818192 
podStartE2EDuration="4m13.8440636s" podCreationTimestamp="2026-04-22 19:24:19 +0000 UTC" firstStartedPulling="2026-04-22 19:28:29.986920524 +0000 UTC m=+282.634935306" lastFinishedPulling="2026-04-22 19:28:31.860165931 +0000 UTC m=+284.508180714" observedRunningTime="2026-04-22 19:28:32.842937714 +0000 UTC m=+285.490952538" watchObservedRunningTime="2026-04-22 19:28:32.8440636 +0000 UTC m=+285.492078404" Apr 22 19:28:42.811882 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:42.811849 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mmxtx" Apr 22 19:28:47.853496 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:47.853458 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:28:47.854166 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:47.853654 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:28:47.859805 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:28:47.859784 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 19:30:44.663296 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.663258 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896"] Apr 22 19:30:44.666441 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.666424 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" Apr 22 19:30:44.669433 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.669409 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 19:30:44.669545 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.669410 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 19:30:44.670458 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.670437 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-84tsk\"" Apr 22 19:30:44.675291 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.675268 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896"] Apr 22 19:30:44.760037 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.759999 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896\" (UID: \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" Apr 22 19:30:44.760253 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.760053 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896\" (UID: \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" Apr 22 19:30:44.760253 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.760169 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94knw\" (UniqueName: \"kubernetes.io/projected/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-kube-api-access-94knw\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896\" (UID: \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" Apr 22 19:30:44.860581 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.860527 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896\" (UID: \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" Apr 22 19:30:44.860581 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.860585 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94knw\" (UniqueName: \"kubernetes.io/projected/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-kube-api-access-94knw\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896\" (UID: \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" Apr 22 19:30:44.860800 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.860622 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896\" (UID: \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" Apr 22 19:30:44.860989 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.860964 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896\" (UID: \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" Apr 22 19:30:44.861069 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.860980 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896\" (UID: \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" Apr 22 19:30:44.869229 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:44.869196 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94knw\" (UniqueName: \"kubernetes.io/projected/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-kube-api-access-94knw\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896\" (UID: \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" Apr 22 19:30:44.976084 ip-10-0-140-242 kubenswrapper[2569]: I0422 
19:30:44.976038 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" Apr 22 19:30:45.100967 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:45.100931 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896"] Apr 22 19:30:45.104115 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:30:45.104070 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbcc846d_59d7_4a55_98e3_3cfaa25c6ca2.slice/crio-7b07e400256b4856d7cf0ef6ace36df35eff8ac2cfa7d26c612a318929f810ab WatchSource:0}: Error finding container 7b07e400256b4856d7cf0ef6ace36df35eff8ac2cfa7d26c612a318929f810ab: Status 404 returned error can't find the container with id 7b07e400256b4856d7cf0ef6ace36df35eff8ac2cfa7d26c612a318929f810ab Apr 22 19:30:45.105824 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:45.105809 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:30:45.150584 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:45.150552 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" event={"ID":"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2","Type":"ContainerStarted","Data":"7b07e400256b4856d7cf0ef6ace36df35eff8ac2cfa7d26c612a318929f810ab"} Apr 22 19:30:51.168076 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:51.168037 2569 generic.go:358] "Generic (PLEG): container finished" podID="bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2" containerID="c02fec4ebf5db08c9d91b1d9963dc9ffd78c47eb8275c2eb576dd29f64b2e14c" exitCode=0 Apr 22 19:30:51.168560 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:51.168122 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" event={"ID":"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2","Type":"ContainerDied","Data":"c02fec4ebf5db08c9d91b1d9963dc9ffd78c47eb8275c2eb576dd29f64b2e14c"} Apr 22 19:30:53.175640 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:53.175595 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" event={"ID":"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2","Type":"ContainerStarted","Data":"9a215091ef8a5c8093310af97a0745169da325535331cf6289a1af9862df1c99"} Apr 22 19:30:54.179583 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:54.179551 2569 generic.go:358] "Generic (PLEG): container finished" podID="bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2" containerID="9a215091ef8a5c8093310af97a0745169da325535331cf6289a1af9862df1c99" exitCode=0 Apr 22 19:30:54.179962 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:30:54.179638 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" event={"ID":"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2","Type":"ContainerDied","Data":"9a215091ef8a5c8093310af97a0745169da325535331cf6289a1af9862df1c99"} Apr 22 19:31:00.199023 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:00.198990 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" 
event={"ID":"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2","Type":"ContainerStarted","Data":"3aae565d54d8ebfa37e93ead0fab967dd2e360455406963a3e360ee0e6b13498"} Apr 22 19:31:00.221023 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:00.220977 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" podStartSLOduration=1.275220719 podStartE2EDuration="16.220962744s" podCreationTimestamp="2026-04-22 19:30:44 +0000 UTC" firstStartedPulling="2026-04-22 19:30:45.105936812 +0000 UTC m=+417.753951594" lastFinishedPulling="2026-04-22 19:31:00.051678823 +0000 UTC m=+432.699693619" observedRunningTime="2026-04-22 19:31:00.219732629 +0000 UTC m=+432.867747454" watchObservedRunningTime="2026-04-22 19:31:00.220962744 +0000 UTC m=+432.868977547" Apr 22 19:31:01.208151 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:01.208088 2569 generic.go:358] "Generic (PLEG): container finished" podID="bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2" containerID="3aae565d54d8ebfa37e93ead0fab967dd2e360455406963a3e360ee0e6b13498" exitCode=0 Apr 22 19:31:01.208591 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:01.208192 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" event={"ID":"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2","Type":"ContainerDied","Data":"3aae565d54d8ebfa37e93ead0fab967dd2e360455406963a3e360ee0e6b13498"} Apr 22 19:31:02.333350 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:02.333324 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" Apr 22 19:31:02.504889 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:02.504789 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94knw\" (UniqueName: \"kubernetes.io/projected/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-kube-api-access-94knw\") pod \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\" (UID: \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\") " Apr 22 19:31:02.504889 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:02.504841 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-bundle\") pod \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\" (UID: \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\") " Apr 22 19:31:02.504889 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:02.504871 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-util\") pod \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\" (UID: \"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2\") " Apr 22 19:31:02.505466 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:02.505441 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-bundle" (OuterVolumeSpecName: "bundle") pod "bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2" (UID: "bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:31:02.507287 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:02.507258 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-kube-api-access-94knw" (OuterVolumeSpecName: "kube-api-access-94knw") pod "bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2" (UID: "bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2"). InnerVolumeSpecName "kube-api-access-94knw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:31:02.509144 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:02.509116 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-util" (OuterVolumeSpecName: "util") pod "bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2" (UID: "bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:31:02.605823 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:02.605781 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94knw\" (UniqueName: \"kubernetes.io/projected/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-kube-api-access-94knw\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:31:02.605823 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:02.605816 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-bundle\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:31:02.605823 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:02.605828 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2-util\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:31:03.214080 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:03.214044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" event={"ID":"bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2","Type":"ContainerDied","Data":"7b07e400256b4856d7cf0ef6ace36df35eff8ac2cfa7d26c612a318929f810ab"} Apr 22 19:31:03.214080 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:03.214081 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b07e400256b4856d7cf0ef6ace36df35eff8ac2cfa7d26c612a318929f810ab" Apr 22 19:31:03.214317 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:03.214118 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cn4896" Apr 22 19:31:11.207822 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.207784 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-x6xk8"] Apr 22 19:31:11.208226 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.208018 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2" containerName="util" Apr 22 19:31:11.208226 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.208029 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2" containerName="util" Apr 22 19:31:11.208226 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.208039 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2" containerName="pull" Apr 22 19:31:11.208226 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.208044 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2" containerName="pull" Apr 22 19:31:11.208226 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.208053 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2" containerName="extract" Apr 22 19:31:11.208226 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.208059 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2" containerName="extract" Apr 22 19:31:11.208226 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.208119 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bbcc846d-59d7-4a55-98e3-3cfaa25c6ca2" containerName="extract" Apr 22 19:31:11.254590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.254539 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-x6xk8"] Apr 22 19:31:11.254784 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.254719 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:11.257543 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.257515 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 22 19:31:11.257687 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.257515 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 19:31:11.257687 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.257585 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-hjv9s\"" Apr 22 19:31:11.257687 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.257662 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 19:31:11.257687 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.257672 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 19:31:11.258706 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.258687 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 19:31:11.264265 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.264237 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcxqq\" (UniqueName: \"kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-kube-api-access-zcxqq\") pod \"keda-operator-ffbb595cb-x6xk8\" (UID: \"40b86f04-3e4b-403c-a679-80007cdca322\") " pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:11.264390 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.264293 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/40b86f04-3e4b-403c-a679-80007cdca322-cabundle0\") pod \"keda-operator-ffbb595cb-x6xk8\" (UID: \"40b86f04-3e4b-403c-a679-80007cdca322\") " pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:11.264390 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.264362 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates\") pod \"keda-operator-ffbb595cb-x6xk8\" (UID: \"40b86f04-3e4b-403c-a679-80007cdca322\") " pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:11.365578 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.365545 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates\") pod \"keda-operator-ffbb595cb-x6xk8\" (UID: \"40b86f04-3e4b-403c-a679-80007cdca322\") " pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:11.365578 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.365584 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcxqq\" (UniqueName: \"kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-kube-api-access-zcxqq\") pod \"keda-operator-ffbb595cb-x6xk8\" (UID: \"40b86f04-3e4b-403c-a679-80007cdca322\") " pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:11.365860 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.365613 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/40b86f04-3e4b-403c-a679-80007cdca322-cabundle0\") pod \"keda-operator-ffbb595cb-x6xk8\" (UID: \"40b86f04-3e4b-403c-a679-80007cdca322\") " pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:11.365860 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:11.365696 2569 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 22 19:31:11.365860 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:11.365720 2569 secret.go:281] references non-existent secret key: ca.crt Apr 22 19:31:11.365860 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:11.365729 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 19:31:11.365860 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:11.365744 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-x6xk8: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 22 19:31:11.365860 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:11.365807 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates podName:40b86f04-3e4b-403c-a679-80007cdca322 nodeName:}" failed. No retries permitted until 2026-04-22 19:31:11.865788076 +0000 UTC m=+444.513802858 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates") pod "keda-operator-ffbb595cb-x6xk8" (UID: "40b86f04-3e4b-403c-a679-80007cdca322") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 22 19:31:11.366200 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.366183 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/40b86f04-3e4b-403c-a679-80007cdca322-cabundle0\") pod \"keda-operator-ffbb595cb-x6xk8\" (UID: \"40b86f04-3e4b-403c-a679-80007cdca322\") " pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:11.375266 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.375236 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcxqq\" (UniqueName: \"kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-kube-api-access-zcxqq\") pod \"keda-operator-ffbb595cb-x6xk8\" (UID: \"40b86f04-3e4b-403c-a679-80007cdca322\") " pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:11.529835 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.529749 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr"] Apr 22 19:31:11.555935 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.555894 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr"] Apr 22 19:31:11.556141 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.556048 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:11.558922 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.558902 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 19:31:11.567877 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.567852 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a821ac81-c49c-4339-a8a5-f33c005d3e6e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-tqsbr\" (UID: \"a821ac81-c49c-4339-a8a5-f33c005d3e6e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:11.568001 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.567894 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-tqsbr\" (UID: \"a821ac81-c49c-4339-a8a5-f33c005d3e6e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:11.568001 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.567918 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8r46\" (UniqueName: \"kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-kube-api-access-t8r46\") pod \"keda-metrics-apiserver-7c9f485588-tqsbr\" (UID: \"a821ac81-c49c-4339-a8a5-f33c005d3e6e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:11.668693 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.668649 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8r46\" (UniqueName: \"kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-kube-api-access-t8r46\") pod \"keda-metrics-apiserver-7c9f485588-tqsbr\" (UID: \"a821ac81-c49c-4339-a8a5-f33c005d3e6e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:11.668879 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.668737 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a821ac81-c49c-4339-a8a5-f33c005d3e6e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-tqsbr\" (UID: \"a821ac81-c49c-4339-a8a5-f33c005d3e6e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:11.668879 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.668779 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-tqsbr\" (UID: \"a821ac81-c49c-4339-a8a5-f33c005d3e6e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:11.668952 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:11.668878 2569 secret.go:281] references non-existent secret key: tls.crt Apr 22 19:31:11.668952 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:11.668890 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 19:31:11.668952 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:11.668908 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr: references 
non-existent secret key: tls.crt Apr 22 19:31:11.669048 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:11.668961 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-certificates podName:a821ac81-c49c-4339-a8a5-f33c005d3e6e nodeName:}" failed. No retries permitted until 2026-04-22 19:31:12.168942909 +0000 UTC m=+444.816957693 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-certificates") pod "keda-metrics-apiserver-7c9f485588-tqsbr" (UID: "a821ac81-c49c-4339-a8a5-f33c005d3e6e") : references non-existent secret key: tls.crt Apr 22 19:31:11.669248 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.669224 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a821ac81-c49c-4339-a8a5-f33c005d3e6e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-tqsbr\" (UID: \"a821ac81-c49c-4339-a8a5-f33c005d3e6e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:11.678082 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.678053 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8r46\" (UniqueName: \"kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-kube-api-access-t8r46\") pod \"keda-metrics-apiserver-7c9f485588-tqsbr\" (UID: \"a821ac81-c49c-4339-a8a5-f33c005d3e6e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:11.870238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.870130 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates\") pod \"keda-operator-ffbb595cb-x6xk8\" (UID: \"40b86f04-3e4b-403c-a679-80007cdca322\") " pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:11.870434 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:11.870296 2569 secret.go:281] references non-existent secret key: ca.crt Apr 22 19:31:11.870434 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:11.870319 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 19:31:11.870434 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:11.870335 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-x6xk8: references non-existent secret key: ca.crt Apr 22 19:31:11.870434 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:11.870387 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates podName:40b86f04-3e4b-403c-a679-80007cdca322 nodeName:}" failed. No retries permitted until 2026-04-22 19:31:12.870372278 +0000 UTC m=+445.518387060 (durationBeforeRetry 1s). 
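The MountVolume.SetUp failures above are the kubelet refusing to assemble the projected "certificates" volume while its sources are incomplete: secret keda-operator-certs does not exist yet, and kedaorg-certs is missing the ca.crt key (tls.crt in the metrics-apiserver case), so the operation is requeued. A minimal diagnostic sketch with client-go, assuming cluster access through a local kubeconfig (the program and kubeconfig path are assumptions; the namespace, secret names, and key names are taken from the log):

```go
// Minimal sketch: list the data keys of the secrets the projected volume references,
// to see why the kubelet reports "references non-existent secret key".
// Assumes a reachable cluster and a kubeconfig at $HOME/.kube/config.
package main

import (
	"context"
	"fmt"
	"os"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	kubeconfig := filepath.Join(os.Getenv("HOME"), ".kube", "config")
	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	for _, name := range []string{"keda-operator-certs", "kedaorg-certs"} {
		sec, err := cs.CoreV1().Secrets("openshift-keda").Get(context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			fmt.Printf("%s: %v\n", name, err) // e.g. the "not found" error seen in the log
			continue
		}
		for key := range sec.Data {
			fmt.Printf("%s has key %s\n", name, key) // expect ca.crt / tls.crt once the certs are issued
		}
	}
}
```

Once the secrets are populated, the same SetUp operations go through, as the later "MountVolume.SetUp succeeded" records for both pods show.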
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates") pod "keda-operator-ffbb595cb-x6xk8" (UID: "40b86f04-3e4b-403c-a679-80007cdca322") : references non-existent secret key: ca.crt Apr 22 19:31:11.928396 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.928341 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-77x4p"] Apr 22 19:31:11.948256 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.948224 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-77x4p"] Apr 22 19:31:11.948434 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.948380 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-77x4p" Apr 22 19:31:11.951001 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.950977 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 22 19:31:11.971224 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.971184 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ff15c36c-1db9-4a63-9810-d88c0c983ae7-certificates\") pod \"keda-admission-cf49989db-77x4p\" (UID: \"ff15c36c-1db9-4a63-9810-d88c0c983ae7\") " pod="openshift-keda/keda-admission-cf49989db-77x4p" Apr 22 19:31:11.971402 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:11.971293 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z979\" (UniqueName: \"kubernetes.io/projected/ff15c36c-1db9-4a63-9810-d88c0c983ae7-kube-api-access-7z979\") pod \"keda-admission-cf49989db-77x4p\" (UID: \"ff15c36c-1db9-4a63-9810-d88c0c983ae7\") " pod="openshift-keda/keda-admission-cf49989db-77x4p" Apr 22 19:31:12.071869 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:12.071830 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z979\" (UniqueName: \"kubernetes.io/projected/ff15c36c-1db9-4a63-9810-d88c0c983ae7-kube-api-access-7z979\") pod \"keda-admission-cf49989db-77x4p\" (UID: \"ff15c36c-1db9-4a63-9810-d88c0c983ae7\") " pod="openshift-keda/keda-admission-cf49989db-77x4p" Apr 22 19:31:12.072067 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:12.071911 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ff15c36c-1db9-4a63-9810-d88c0c983ae7-certificates\") pod \"keda-admission-cf49989db-77x4p\" (UID: \"ff15c36c-1db9-4a63-9810-d88c0c983ae7\") " pod="openshift-keda/keda-admission-cf49989db-77x4p" Apr 22 19:31:12.074641 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:12.074611 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ff15c36c-1db9-4a63-9810-d88c0c983ae7-certificates\") pod \"keda-admission-cf49989db-77x4p\" (UID: \"ff15c36c-1db9-4a63-9810-d88c0c983ae7\") " pod="openshift-keda/keda-admission-cf49989db-77x4p" Apr 22 19:31:12.081166 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:12.081139 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z979\" (UniqueName: \"kubernetes.io/projected/ff15c36c-1db9-4a63-9810-d88c0c983ae7-kube-api-access-7z979\") pod \"keda-admission-cf49989db-77x4p\" (UID: 
\"ff15c36c-1db9-4a63-9810-d88c0c983ae7\") " pod="openshift-keda/keda-admission-cf49989db-77x4p" Apr 22 19:31:12.172836 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:12.172736 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-tqsbr\" (UID: \"a821ac81-c49c-4339-a8a5-f33c005d3e6e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:12.172996 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:12.172870 2569 secret.go:281] references non-existent secret key: tls.crt Apr 22 19:31:12.172996 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:12.172889 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 19:31:12.172996 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:12.172910 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr: references non-existent secret key: tls.crt Apr 22 19:31:12.172996 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:12.172959 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-certificates podName:a821ac81-c49c-4339-a8a5-f33c005d3e6e nodeName:}" failed. No retries permitted until 2026-04-22 19:31:13.172946121 +0000 UTC m=+445.820960902 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-certificates") pod "keda-metrics-apiserver-7c9f485588-tqsbr" (UID: "a821ac81-c49c-4339-a8a5-f33c005d3e6e") : references non-existent secret key: tls.crt Apr 22 19:31:12.259062 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:12.259026 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-77x4p" Apr 22 19:31:12.395720 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:12.395686 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-77x4p"] Apr 22 19:31:12.399186 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:31:12.399154 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff15c36c_1db9_4a63_9810_d88c0c983ae7.slice/crio-ca5f855a4210512fdc2a192abb731f062c52c1927b36ae375e69d41c5ade15e6 WatchSource:0}: Error finding container ca5f855a4210512fdc2a192abb731f062c52c1927b36ae375e69d41c5ade15e6: Status 404 returned error can't find the container with id ca5f855a4210512fdc2a192abb731f062c52c1927b36ae375e69d41c5ade15e6 Apr 22 19:31:12.877561 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:12.877526 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates\") pod \"keda-operator-ffbb595cb-x6xk8\" (UID: \"40b86f04-3e4b-403c-a679-80007cdca322\") " pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:12.877780 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:12.877672 2569 secret.go:281] references non-existent secret key: ca.crt Apr 22 19:31:12.877780 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:12.877693 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 19:31:12.877780 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:12.877703 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-x6xk8: references non-existent secret key: ca.crt Apr 22 19:31:12.877780 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:12.877761 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates podName:40b86f04-3e4b-403c-a679-80007cdca322 nodeName:}" failed. No retries permitted until 2026-04-22 19:31:14.87774376 +0000 UTC m=+447.525758543 (durationBeforeRetry 2s). 
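Each failed SetUp above is requeued by nestedpendingoperations with a doubling delay: durationBeforeRetry goes 500ms, 1s, 2s here and reaches 4s further down for the operator pod before the mounts finally succeed. A tiny sketch of that schedule, illustrating the delays visible in the log rather than the kubelet's actual retry code:

```go
// Illustrative only: the retry delays visible above follow a simple doubling schedule.
// The kubelet keeps doubling (up to an internal cap) until the SetUp eventually succeeds.
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond
	for attempt := 1; attempt <= 4; attempt++ {
		fmt.Printf("attempt %d: retry after %v\n", attempt, delay)
		delay *= 2 // matches durationBeforeRetry 500ms -> 1s -> 2s -> 4s in the log
	}
}
```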
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates") pod "keda-operator-ffbb595cb-x6xk8" (UID: "40b86f04-3e4b-403c-a679-80007cdca322") : references non-existent secret key: ca.crt Apr 22 19:31:13.179895 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:13.179799 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-tqsbr\" (UID: \"a821ac81-c49c-4339-a8a5-f33c005d3e6e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:13.180080 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:13.179969 2569 secret.go:281] references non-existent secret key: tls.crt Apr 22 19:31:13.180080 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:13.179990 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 19:31:13.180080 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:13.180012 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr: references non-existent secret key: tls.crt Apr 22 19:31:13.180080 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:13.180081 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-certificates podName:a821ac81-c49c-4339-a8a5-f33c005d3e6e nodeName:}" failed. No retries permitted until 2026-04-22 19:31:15.180061721 +0000 UTC m=+447.828076518 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-certificates") pod "keda-metrics-apiserver-7c9f485588-tqsbr" (UID: "a821ac81-c49c-4339-a8a5-f33c005d3e6e") : references non-existent secret key: tls.crt Apr 22 19:31:13.244922 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:13.244879 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-77x4p" event={"ID":"ff15c36c-1db9-4a63-9810-d88c0c983ae7","Type":"ContainerStarted","Data":"ca5f855a4210512fdc2a192abb731f062c52c1927b36ae375e69d41c5ade15e6"} Apr 22 19:31:14.249537 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:14.249494 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-77x4p" event={"ID":"ff15c36c-1db9-4a63-9810-d88c0c983ae7","Type":"ContainerStarted","Data":"654afda5c64812df1bba4b57107f0ba41f833cc871db7cf604cad5695e014d7c"} Apr 22 19:31:14.249947 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:14.249749 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-77x4p" Apr 22 19:31:14.270889 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:14.270837 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-77x4p" podStartSLOduration=1.784876677 podStartE2EDuration="3.270821672s" podCreationTimestamp="2026-04-22 19:31:11 +0000 UTC" firstStartedPulling="2026-04-22 19:31:12.400940504 +0000 UTC m=+445.048955290" lastFinishedPulling="2026-04-22 19:31:13.8868855 +0000 UTC m=+446.534900285" observedRunningTime="2026-04-22 19:31:14.269404438 +0000 UTC m=+446.917419241" watchObservedRunningTime="2026-04-22 19:31:14.270821672 +0000 UTC m=+446.918836470" Apr 22 
19:31:14.893418 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:14.893380 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates\") pod \"keda-operator-ffbb595cb-x6xk8\" (UID: \"40b86f04-3e4b-403c-a679-80007cdca322\") " pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:14.893604 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:14.893491 2569 secret.go:281] references non-existent secret key: ca.crt Apr 22 19:31:14.893604 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:14.893503 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 19:31:14.893604 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:14.893513 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-x6xk8: references non-existent secret key: ca.crt Apr 22 19:31:14.893604 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:31:14.893563 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates podName:40b86f04-3e4b-403c-a679-80007cdca322 nodeName:}" failed. No retries permitted until 2026-04-22 19:31:18.893549128 +0000 UTC m=+451.541563910 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates") pod "keda-operator-ffbb595cb-x6xk8" (UID: "40b86f04-3e4b-403c-a679-80007cdca322") : references non-existent secret key: ca.crt Apr 22 19:31:15.195918 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:15.195874 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-tqsbr\" (UID: \"a821ac81-c49c-4339-a8a5-f33c005d3e6e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:15.198557 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:15.198534 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a821ac81-c49c-4339-a8a5-f33c005d3e6e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-tqsbr\" (UID: \"a821ac81-c49c-4339-a8a5-f33c005d3e6e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:15.466848 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:15.466755 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:15.591345 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:15.591309 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr"] Apr 22 19:31:15.595289 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:31:15.595259 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda821ac81_c49c_4339_a8a5_f33c005d3e6e.slice/crio-8230af9f8470e63be4e106d357d47fe01a4c38a1f32ed24e01c79e096009dd49 WatchSource:0}: Error finding container 8230af9f8470e63be4e106d357d47fe01a4c38a1f32ed24e01c79e096009dd49: Status 404 returned error can't find the container with id 8230af9f8470e63be4e106d357d47fe01a4c38a1f32ed24e01c79e096009dd49 Apr 22 19:31:16.255731 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:16.255697 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" event={"ID":"a821ac81-c49c-4339-a8a5-f33c005d3e6e","Type":"ContainerStarted","Data":"8230af9f8470e63be4e106d357d47fe01a4c38a1f32ed24e01c79e096009dd49"} Apr 22 19:31:18.263260 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:18.263216 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" event={"ID":"a821ac81-c49c-4339-a8a5-f33c005d3e6e","Type":"ContainerStarted","Data":"99573d4a4837dd2e1e817716475ca29c8b49f2b3823446b64df059a8def3a035"} Apr 22 19:31:18.263665 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:18.263338 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:18.284275 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:18.284222 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" podStartSLOduration=4.779573121 podStartE2EDuration="7.284205307s" podCreationTimestamp="2026-04-22 19:31:11 +0000 UTC" firstStartedPulling="2026-04-22 19:31:15.596666974 +0000 UTC m=+448.244681756" lastFinishedPulling="2026-04-22 19:31:18.101299154 +0000 UTC m=+450.749313942" observedRunningTime="2026-04-22 19:31:18.282371425 +0000 UTC m=+450.930386233" watchObservedRunningTime="2026-04-22 19:31:18.284205307 +0000 UTC m=+450.932220110" Apr 22 19:31:18.931427 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:18.931387 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates\") pod \"keda-operator-ffbb595cb-x6xk8\" (UID: \"40b86f04-3e4b-403c-a679-80007cdca322\") " pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:18.933917 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:18.933895 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/40b86f04-3e4b-403c-a679-80007cdca322-certificates\") pod \"keda-operator-ffbb595cb-x6xk8\" (UID: \"40b86f04-3e4b-403c-a679-80007cdca322\") " pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:19.064800 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:19.064765 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:19.190446 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:19.190353 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-x6xk8"] Apr 22 19:31:19.193274 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:31:19.193237 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40b86f04_3e4b_403c_a679_80007cdca322.slice/crio-d37ec70d558138338d87df35db78ec3a3afef3bd1235e94ca261b6eaad560fbe WatchSource:0}: Error finding container d37ec70d558138338d87df35db78ec3a3afef3bd1235e94ca261b6eaad560fbe: Status 404 returned error can't find the container with id d37ec70d558138338d87df35db78ec3a3afef3bd1235e94ca261b6eaad560fbe Apr 22 19:31:19.266960 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:19.266925 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" event={"ID":"40b86f04-3e4b-403c-a679-80007cdca322","Type":"ContainerStarted","Data":"d37ec70d558138338d87df35db78ec3a3afef3bd1235e94ca261b6eaad560fbe"} Apr 22 19:31:22.277664 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:22.277622 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" event={"ID":"40b86f04-3e4b-403c-a679-80007cdca322","Type":"ContainerStarted","Data":"bef47e83a8f22ff7ea250cd17557e5a9e82b8646414ca135a354cea88a5f01e7"} Apr 22 19:31:22.278121 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:22.277741 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:31:22.296279 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:22.296218 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" podStartSLOduration=8.314244401 podStartE2EDuration="11.296200756s" podCreationTimestamp="2026-04-22 19:31:11 +0000 UTC" firstStartedPulling="2026-04-22 19:31:19.194548394 +0000 UTC m=+451.842563177" lastFinishedPulling="2026-04-22 19:31:22.176504751 +0000 UTC m=+454.824519532" observedRunningTime="2026-04-22 19:31:22.295296874 +0000 UTC m=+454.943311699" watchObservedRunningTime="2026-04-22 19:31:22.296200756 +0000 UTC m=+454.944215561" Apr 22 19:31:29.271740 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:29.271697 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-tqsbr" Apr 22 19:31:35.254534 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:35.254501 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-77x4p" Apr 22 19:31:43.282565 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:31:43.282531 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-x6xk8" Apr 22 19:32:04.686966 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.686922 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt"] Apr 22 19:32:04.696002 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.695976 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" Apr 22 19:32:04.698831 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.698805 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 19:32:04.698975 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.698803 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 19:32:04.700030 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.700001 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-84tsk\"" Apr 22 19:32:04.702376 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.702352 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt"] Apr 22 19:32:04.774637 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.774592 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3335442-0841-49b2-bd64-1c7690530d4a-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt\" (UID: \"d3335442-0841-49b2-bd64-1c7690530d4a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" Apr 22 19:32:04.774831 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.774642 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r8bw\" (UniqueName: \"kubernetes.io/projected/d3335442-0841-49b2-bd64-1c7690530d4a-kube-api-access-8r8bw\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt\" (UID: \"d3335442-0841-49b2-bd64-1c7690530d4a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" Apr 22 19:32:04.774831 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.774727 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3335442-0841-49b2-bd64-1c7690530d4a-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt\" (UID: \"d3335442-0841-49b2-bd64-1c7690530d4a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" Apr 22 19:32:04.876051 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.876008 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3335442-0841-49b2-bd64-1c7690530d4a-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt\" (UID: \"d3335442-0841-49b2-bd64-1c7690530d4a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" Apr 22 19:32:04.876265 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.876074 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3335442-0841-49b2-bd64-1c7690530d4a-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt\" (UID: \"d3335442-0841-49b2-bd64-1c7690530d4a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" Apr 22 19:32:04.876265 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.876130 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8r8bw\" (UniqueName: \"kubernetes.io/projected/d3335442-0841-49b2-bd64-1c7690530d4a-kube-api-access-8r8bw\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt\" (UID: \"d3335442-0841-49b2-bd64-1c7690530d4a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" Apr 22 19:32:04.876501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.876482 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3335442-0841-49b2-bd64-1c7690530d4a-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt\" (UID: \"d3335442-0841-49b2-bd64-1c7690530d4a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" Apr 22 19:32:04.876548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.876529 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3335442-0841-49b2-bd64-1c7690530d4a-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt\" (UID: \"d3335442-0841-49b2-bd64-1c7690530d4a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" Apr 22 19:32:04.885362 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:04.885328 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r8bw\" (UniqueName: \"kubernetes.io/projected/d3335442-0841-49b2-bd64-1c7690530d4a-kube-api-access-8r8bw\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt\" (UID: \"d3335442-0841-49b2-bd64-1c7690530d4a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" Apr 22 19:32:05.006737 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:05.006698 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" Apr 22 19:32:05.148529 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:05.148504 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt"] Apr 22 19:32:05.151318 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:32:05.151289 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3335442_0841_49b2_bd64_1c7690530d4a.slice/crio-064bd7698787b56dd2ce80b77c8f391c764a490f32fa77a398e3909b4c6364f4 WatchSource:0}: Error finding container 064bd7698787b56dd2ce80b77c8f391c764a490f32fa77a398e3909b4c6364f4: Status 404 returned error can't find the container with id 064bd7698787b56dd2ce80b77c8f391c764a490f32fa77a398e3909b4c6364f4 Apr 22 19:32:05.401816 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:05.401731 2569 generic.go:358] "Generic (PLEG): container finished" podID="d3335442-0841-49b2-bd64-1c7690530d4a" containerID="ecfd5fed8fe8176981d0ac90310757c3c1db474581dc6da7e2bfacc8a1e552d6" exitCode=0 Apr 22 19:32:05.401816 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:05.401781 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" event={"ID":"d3335442-0841-49b2-bd64-1c7690530d4a","Type":"ContainerDied","Data":"ecfd5fed8fe8176981d0ac90310757c3c1db474581dc6da7e2bfacc8a1e552d6"} Apr 22 19:32:05.401816 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:05.401804 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" event={"ID":"d3335442-0841-49b2-bd64-1c7690530d4a","Type":"ContainerStarted","Data":"064bd7698787b56dd2ce80b77c8f391c764a490f32fa77a398e3909b4c6364f4"} Apr 22 19:32:08.413279 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:08.413246 2569 generic.go:358] "Generic (PLEG): container finished" podID="d3335442-0841-49b2-bd64-1c7690530d4a" containerID="3ddc9b3b97c5cf736201caca5ea2856a7d21fe9301a908377d01560369cf5e01" exitCode=0 Apr 22 19:32:08.413670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:08.413313 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" event={"ID":"d3335442-0841-49b2-bd64-1c7690530d4a","Type":"ContainerDied","Data":"3ddc9b3b97c5cf736201caca5ea2856a7d21fe9301a908377d01560369cf5e01"} Apr 22 19:32:09.417992 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:09.417948 2569 generic.go:358] "Generic (PLEG): container finished" podID="d3335442-0841-49b2-bd64-1c7690530d4a" containerID="51dbcb8bdd2e3564e7e2ac7dc8208bf7eb464d2c05fb6b324e68ce9f035c3340" exitCode=0 Apr 22 19:32:09.418433 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:09.418032 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" event={"ID":"d3335442-0841-49b2-bd64-1c7690530d4a","Type":"ContainerDied","Data":"51dbcb8bdd2e3564e7e2ac7dc8208bf7eb464d2c05fb6b324e68ce9f035c3340"} Apr 22 19:32:10.554725 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:10.554702 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" Apr 22 19:32:10.721493 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:10.721461 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r8bw\" (UniqueName: \"kubernetes.io/projected/d3335442-0841-49b2-bd64-1c7690530d4a-kube-api-access-8r8bw\") pod \"d3335442-0841-49b2-bd64-1c7690530d4a\" (UID: \"d3335442-0841-49b2-bd64-1c7690530d4a\") " Apr 22 19:32:10.721670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:10.721526 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3335442-0841-49b2-bd64-1c7690530d4a-bundle\") pod \"d3335442-0841-49b2-bd64-1c7690530d4a\" (UID: \"d3335442-0841-49b2-bd64-1c7690530d4a\") " Apr 22 19:32:10.721670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:10.721592 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3335442-0841-49b2-bd64-1c7690530d4a-util\") pod \"d3335442-0841-49b2-bd64-1c7690530d4a\" (UID: \"d3335442-0841-49b2-bd64-1c7690530d4a\") " Apr 22 19:32:10.722303 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:10.722280 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3335442-0841-49b2-bd64-1c7690530d4a-bundle" (OuterVolumeSpecName: "bundle") pod "d3335442-0841-49b2-bd64-1c7690530d4a" (UID: "d3335442-0841-49b2-bd64-1c7690530d4a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:32:10.723776 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:10.723753 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3335442-0841-49b2-bd64-1c7690530d4a-kube-api-access-8r8bw" (OuterVolumeSpecName: "kube-api-access-8r8bw") pod "d3335442-0841-49b2-bd64-1c7690530d4a" (UID: "d3335442-0841-49b2-bd64-1c7690530d4a"). InnerVolumeSpecName "kube-api-access-8r8bw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:32:10.776603 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:10.776538 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3335442-0841-49b2-bd64-1c7690530d4a-util" (OuterVolumeSpecName: "util") pod "d3335442-0841-49b2-bd64-1c7690530d4a" (UID: "d3335442-0841-49b2-bd64-1c7690530d4a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:32:10.822514 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:10.822477 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3335442-0841-49b2-bd64-1c7690530d4a-util\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:32:10.822514 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:10.822512 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8r8bw\" (UniqueName: \"kubernetes.io/projected/d3335442-0841-49b2-bd64-1c7690530d4a-kube-api-access-8r8bw\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:32:10.822673 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:10.822523 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3335442-0841-49b2-bd64-1c7690530d4a-bundle\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:32:11.427735 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:11.427701 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" event={"ID":"d3335442-0841-49b2-bd64-1c7690530d4a","Type":"ContainerDied","Data":"064bd7698787b56dd2ce80b77c8f391c764a490f32fa77a398e3909b4c6364f4"} Apr 22 19:32:11.427735 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:11.427733 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="064bd7698787b56dd2ce80b77c8f391c764a490f32fa77a398e3909b4c6364f4" Apr 22 19:32:11.427943 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:11.427751 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6nplt" Apr 22 19:32:26.838107 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.838052 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp"] Apr 22 19:32:26.838491 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.838351 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3335442-0841-49b2-bd64-1c7690530d4a" containerName="pull" Apr 22 19:32:26.838491 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.838364 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3335442-0841-49b2-bd64-1c7690530d4a" containerName="pull" Apr 22 19:32:26.838491 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.838372 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3335442-0841-49b2-bd64-1c7690530d4a" containerName="extract" Apr 22 19:32:26.838491 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.838377 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3335442-0841-49b2-bd64-1c7690530d4a" containerName="extract" Apr 22 19:32:26.838491 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.838388 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3335442-0841-49b2-bd64-1c7690530d4a" containerName="util" Apr 22 19:32:26.838491 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.838396 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3335442-0841-49b2-bd64-1c7690530d4a" containerName="util" Apr 22 19:32:26.838491 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.838439 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3335442-0841-49b2-bd64-1c7690530d4a" 
containerName="extract" Apr 22 19:32:26.842742 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.842723 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" Apr 22 19:32:26.845439 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.845420 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 19:32:26.846588 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.846552 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-84tsk\"" Apr 22 19:32:26.846694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.846590 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 19:32:26.849307 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.849283 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp"] Apr 22 19:32:26.940372 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.940333 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp\" (UID: \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" Apr 22 19:32:26.940567 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.940404 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp\" (UID: \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" Apr 22 19:32:26.940567 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:26.940423 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfbhx\" (UniqueName: \"kubernetes.io/projected/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-kube-api-access-rfbhx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp\" (UID: \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" Apr 22 19:32:27.041036 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:27.040997 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp\" (UID: \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" Apr 22 19:32:27.041036 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:27.041039 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rfbhx\" (UniqueName: \"kubernetes.io/projected/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-kube-api-access-rfbhx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp\" (UID: \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" Apr 22 19:32:27.041290 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:27.041074 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp\" (UID: \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" Apr 22 19:32:27.041495 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:27.041480 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp\" (UID: \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" Apr 22 19:32:27.041531 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:27.041479 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp\" (UID: \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" Apr 22 19:32:27.049300 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:27.049276 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfbhx\" (UniqueName: \"kubernetes.io/projected/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-kube-api-access-rfbhx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp\" (UID: \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" Apr 22 19:32:27.152475 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:27.152391 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" Apr 22 19:32:27.275443 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:27.275411 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp"] Apr 22 19:32:27.278601 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:32:27.278571 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafa4ed87_b58d_4ad6_bda2_3b1501a5c7df.slice/crio-263780b199fd4255f4fc89490343016d7d1334341f5251b54fbc830fa7f94e3f WatchSource:0}: Error finding container 263780b199fd4255f4fc89490343016d7d1334341f5251b54fbc830fa7f94e3f: Status 404 returned error can't find the container with id 263780b199fd4255f4fc89490343016d7d1334341f5251b54fbc830fa7f94e3f Apr 22 19:32:27.474108 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:27.474052 2569 generic.go:358] "Generic (PLEG): container finished" podID="afa4ed87-b58d-4ad6-bda2-3b1501a5c7df" containerID="a81c5aad2e3727f6743f683ecdb6458313064c547e23882ea3c31636c542b938" exitCode=0 Apr 22 19:32:27.474285 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:27.474144 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" event={"ID":"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df","Type":"ContainerDied","Data":"a81c5aad2e3727f6743f683ecdb6458313064c547e23882ea3c31636c542b938"} Apr 22 19:32:27.474285 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:27.474175 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" event={"ID":"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df","Type":"ContainerStarted","Data":"263780b199fd4255f4fc89490343016d7d1334341f5251b54fbc830fa7f94e3f"} Apr 22 19:32:30.483857 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:30.483822 2569 generic.go:358] "Generic (PLEG): container finished" podID="afa4ed87-b58d-4ad6-bda2-3b1501a5c7df" containerID="18eeb04e87d0ab2602699ce40e57a359bbde2534069ec5e883785b7c3d715a9f" exitCode=0 Apr 22 19:32:30.484258 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:30.483907 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" event={"ID":"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df","Type":"ContainerDied","Data":"18eeb04e87d0ab2602699ce40e57a359bbde2534069ec5e883785b7c3d715a9f"} Apr 22 19:32:31.488989 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:31.488952 2569 generic.go:358] "Generic (PLEG): container finished" podID="afa4ed87-b58d-4ad6-bda2-3b1501a5c7df" containerID="d0bd47e3bcab8e7b55e9b4ff541d1d2d0c934979db7331da956ead85195665ae" exitCode=0 Apr 22 19:32:31.489390 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:31.489039 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" event={"ID":"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df","Type":"ContainerDied","Data":"d0bd47e3bcab8e7b55e9b4ff541d1d2d0c934979db7331da956ead85195665ae"} Apr 22 19:32:32.611750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:32.611730 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" Apr 22 19:32:32.683113 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:32.683058 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-util\") pod \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\" (UID: \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\") " Apr 22 19:32:32.683332 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:32.683143 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfbhx\" (UniqueName: \"kubernetes.io/projected/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-kube-api-access-rfbhx\") pod \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\" (UID: \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\") " Apr 22 19:32:32.683332 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:32.683171 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-bundle\") pod \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\" (UID: \"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df\") " Apr 22 19:32:32.683593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:32.683569 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-bundle" (OuterVolumeSpecName: "bundle") pod "afa4ed87-b58d-4ad6-bda2-3b1501a5c7df" (UID: "afa4ed87-b58d-4ad6-bda2-3b1501a5c7df"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:32:32.685543 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:32.685519 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-kube-api-access-rfbhx" (OuterVolumeSpecName: "kube-api-access-rfbhx") pod "afa4ed87-b58d-4ad6-bda2-3b1501a5c7df" (UID: "afa4ed87-b58d-4ad6-bda2-3b1501a5c7df"). InnerVolumeSpecName "kube-api-access-rfbhx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:32:32.784448 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:32.784362 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rfbhx\" (UniqueName: \"kubernetes.io/projected/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-kube-api-access-rfbhx\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:32:32.784448 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:32.784396 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-bundle\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:32:32.827247 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:32.827193 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-util" (OuterVolumeSpecName: "util") pod "afa4ed87-b58d-4ad6-bda2-3b1501a5c7df" (UID: "afa4ed87-b58d-4ad6-bda2-3b1501a5c7df"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:32:32.885499 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:32.885464 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/afa4ed87-b58d-4ad6-bda2-3b1501a5c7df-util\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:32:33.495547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:33.495516 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" event={"ID":"afa4ed87-b58d-4ad6-bda2-3b1501a5c7df","Type":"ContainerDied","Data":"263780b199fd4255f4fc89490343016d7d1334341f5251b54fbc830fa7f94e3f"} Apr 22 19:32:33.495547 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:33.495546 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="263780b199fd4255f4fc89490343016d7d1334341f5251b54fbc830fa7f94e3f" Apr 22 19:32:33.495755 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:33.495571 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fng5lp" Apr 22 19:32:56.256624 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.256547 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r"] Apr 22 19:32:56.257076 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.256811 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afa4ed87-b58d-4ad6-bda2-3b1501a5c7df" containerName="util" Apr 22 19:32:56.257076 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.256821 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa4ed87-b58d-4ad6-bda2-3b1501a5c7df" containerName="util" Apr 22 19:32:56.257076 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.256829 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afa4ed87-b58d-4ad6-bda2-3b1501a5c7df" containerName="extract" Apr 22 19:32:56.257076 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.256835 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa4ed87-b58d-4ad6-bda2-3b1501a5c7df" containerName="extract" Apr 22 19:32:56.257076 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.256854 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afa4ed87-b58d-4ad6-bda2-3b1501a5c7df" containerName="pull" Apr 22 19:32:56.257076 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.256859 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa4ed87-b58d-4ad6-bda2-3b1501a5c7df" containerName="pull" Apr 22 19:32:56.257076 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.256902 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="afa4ed87-b58d-4ad6-bda2-3b1501a5c7df" containerName="extract" Apr 22 19:32:56.266256 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.266226 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" Apr 22 19:32:56.267316 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.267288 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r"] Apr 22 19:32:56.268962 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.268940 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-84tsk\"" Apr 22 19:32:56.269080 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.268974 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 19:32:56.270176 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.270158 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 19:32:56.360004 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.359963 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjg99\" (UniqueName: \"kubernetes.io/projected/693eca2d-910d-4c10-9408-362655fc9be7-kube-api-access-zjg99\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r\" (UID: \"693eca2d-910d-4c10-9408-362655fc9be7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" Apr 22 19:32:56.360004 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.360005 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/693eca2d-910d-4c10-9408-362655fc9be7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r\" (UID: \"693eca2d-910d-4c10-9408-362655fc9be7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" Apr 22 19:32:56.360241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.360036 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/693eca2d-910d-4c10-9408-362655fc9be7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r\" (UID: \"693eca2d-910d-4c10-9408-362655fc9be7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" Apr 22 19:32:56.461372 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.461336 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjg99\" (UniqueName: \"kubernetes.io/projected/693eca2d-910d-4c10-9408-362655fc9be7-kube-api-access-zjg99\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r\" (UID: \"693eca2d-910d-4c10-9408-362655fc9be7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" Apr 22 19:32:56.461442 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.461380 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/693eca2d-910d-4c10-9408-362655fc9be7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r\" (UID: \"693eca2d-910d-4c10-9408-362655fc9be7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" Apr 22 19:32:56.461442 ip-10-0-140-242 kubenswrapper[2569]: I0422 
19:32:56.461425 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/693eca2d-910d-4c10-9408-362655fc9be7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r\" (UID: \"693eca2d-910d-4c10-9408-362655fc9be7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" Apr 22 19:32:56.461828 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.461810 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/693eca2d-910d-4c10-9408-362655fc9be7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r\" (UID: \"693eca2d-910d-4c10-9408-362655fc9be7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" Apr 22 19:32:56.461866 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.461826 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/693eca2d-910d-4c10-9408-362655fc9be7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r\" (UID: \"693eca2d-910d-4c10-9408-362655fc9be7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" Apr 22 19:32:56.469834 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.469805 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjg99\" (UniqueName: \"kubernetes.io/projected/693eca2d-910d-4c10-9408-362655fc9be7-kube-api-access-zjg99\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r\" (UID: \"693eca2d-910d-4c10-9408-362655fc9be7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" Apr 22 19:32:56.576072 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.575989 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" Apr 22 19:32:56.698038 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:56.698005 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r"] Apr 22 19:32:56.701132 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:32:56.701091 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod693eca2d_910d_4c10_9408_362655fc9be7.slice/crio-c4a0283b0dc7514b64a39b452206a5e1b0a5225b9ec067f9630462b3ea86da71 WatchSource:0}: Error finding container c4a0283b0dc7514b64a39b452206a5e1b0a5225b9ec067f9630462b3ea86da71: Status 404 returned error can't find the container with id c4a0283b0dc7514b64a39b452206a5e1b0a5225b9ec067f9630462b3ea86da71 Apr 22 19:32:57.561667 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:57.561629 2569 generic.go:358] "Generic (PLEG): container finished" podID="693eca2d-910d-4c10-9408-362655fc9be7" containerID="1fa3b5951577662d73817f275e8c6debecf86e9aace7b6f64d327ec5d7538b1a" exitCode=0 Apr 22 19:32:57.562118 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:57.561671 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" event={"ID":"693eca2d-910d-4c10-9408-362655fc9be7","Type":"ContainerDied","Data":"1fa3b5951577662d73817f275e8c6debecf86e9aace7b6f64d327ec5d7538b1a"} Apr 22 19:32:57.562118 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:57.561700 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" event={"ID":"693eca2d-910d-4c10-9408-362655fc9be7","Type":"ContainerStarted","Data":"c4a0283b0dc7514b64a39b452206a5e1b0a5225b9ec067f9630462b3ea86da71"} Apr 22 19:32:58.566671 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:58.566638 2569 generic.go:358] "Generic (PLEG): container finished" podID="693eca2d-910d-4c10-9408-362655fc9be7" containerID="60f2033785971a8ed4e7863e50f75b58635ef9047ddae5fe8ab82c3b4e86a5b6" exitCode=0 Apr 22 19:32:58.567033 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:58.566684 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" event={"ID":"693eca2d-910d-4c10-9408-362655fc9be7","Type":"ContainerDied","Data":"60f2033785971a8ed4e7863e50f75b58635ef9047ddae5fe8ab82c3b4e86a5b6"} Apr 22 19:32:59.571433 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:59.571401 2569 generic.go:358] "Generic (PLEG): container finished" podID="693eca2d-910d-4c10-9408-362655fc9be7" containerID="4c457ee5538f9eadcc4a73fedb38d22e769fa80b5b87c86382558da6b7b85461" exitCode=0 Apr 22 19:32:59.571803 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:32:59.571447 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" event={"ID":"693eca2d-910d-4c10-9408-362655fc9be7","Type":"ContainerDied","Data":"4c457ee5538f9eadcc4a73fedb38d22e769fa80b5b87c86382558da6b7b85461"} Apr 22 19:33:00.696561 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:00.696538 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" Apr 22 19:33:00.798984 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:00.798940 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjg99\" (UniqueName: \"kubernetes.io/projected/693eca2d-910d-4c10-9408-362655fc9be7-kube-api-access-zjg99\") pod \"693eca2d-910d-4c10-9408-362655fc9be7\" (UID: \"693eca2d-910d-4c10-9408-362655fc9be7\") " Apr 22 19:33:00.798984 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:00.798980 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/693eca2d-910d-4c10-9408-362655fc9be7-util\") pod \"693eca2d-910d-4c10-9408-362655fc9be7\" (UID: \"693eca2d-910d-4c10-9408-362655fc9be7\") " Apr 22 19:33:00.799217 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:00.799041 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/693eca2d-910d-4c10-9408-362655fc9be7-bundle\") pod \"693eca2d-910d-4c10-9408-362655fc9be7\" (UID: \"693eca2d-910d-4c10-9408-362655fc9be7\") " Apr 22 19:33:00.800010 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:00.799980 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/693eca2d-910d-4c10-9408-362655fc9be7-bundle" (OuterVolumeSpecName: "bundle") pod "693eca2d-910d-4c10-9408-362655fc9be7" (UID: "693eca2d-910d-4c10-9408-362655fc9be7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:00.801349 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:00.801328 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693eca2d-910d-4c10-9408-362655fc9be7-kube-api-access-zjg99" (OuterVolumeSpecName: "kube-api-access-zjg99") pod "693eca2d-910d-4c10-9408-362655fc9be7" (UID: "693eca2d-910d-4c10-9408-362655fc9be7"). InnerVolumeSpecName "kube-api-access-zjg99". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:33:00.807368 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:00.807343 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/693eca2d-910d-4c10-9408-362655fc9be7-util" (OuterVolumeSpecName: "util") pod "693eca2d-910d-4c10-9408-362655fc9be7" (UID: "693eca2d-910d-4c10-9408-362655fc9be7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:00.900085 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:00.899982 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zjg99\" (UniqueName: \"kubernetes.io/projected/693eca2d-910d-4c10-9408-362655fc9be7-kube-api-access-zjg99\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:00.900085 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:00.900025 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/693eca2d-910d-4c10-9408-362655fc9be7-util\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:00.900085 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:00.900039 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/693eca2d-910d-4c10-9408-362655fc9be7-bundle\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:01.578403 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:01.578363 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" event={"ID":"693eca2d-910d-4c10-9408-362655fc9be7","Type":"ContainerDied","Data":"c4a0283b0dc7514b64a39b452206a5e1b0a5225b9ec067f9630462b3ea86da71"} Apr 22 19:33:01.578403 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:01.578398 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4a0283b0dc7514b64a39b452206a5e1b0a5225b9ec067f9630462b3ea86da71" Apr 22 19:33:01.578686 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:01.578474 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835pg46r" Apr 22 19:33:06.417561 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.417525 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl"] Apr 22 19:33:06.417925 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.417772 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="693eca2d-910d-4c10-9408-362655fc9be7" containerName="pull" Apr 22 19:33:06.417925 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.417784 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="693eca2d-910d-4c10-9408-362655fc9be7" containerName="pull" Apr 22 19:33:06.417925 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.417800 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="693eca2d-910d-4c10-9408-362655fc9be7" containerName="extract" Apr 22 19:33:06.417925 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.417805 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="693eca2d-910d-4c10-9408-362655fc9be7" containerName="extract" Apr 22 19:33:06.417925 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.417814 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="693eca2d-910d-4c10-9408-362655fc9be7" containerName="util" Apr 22 19:33:06.417925 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.417819 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="693eca2d-910d-4c10-9408-362655fc9be7" containerName="util" Apr 22 19:33:06.417925 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.417863 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="693eca2d-910d-4c10-9408-362655fc9be7" 
containerName="extract" Apr 22 19:33:06.420666 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.420649 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" Apr 22 19:33:06.423546 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.423524 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 19:33:06.423682 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.423526 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 19:33:06.424615 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.424595 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-84tsk\"" Apr 22 19:33:06.433633 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.433607 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl"] Apr 22 19:33:06.544993 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.544953 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnv72\" (UniqueName: \"kubernetes.io/projected/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-kube-api-access-cnv72\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl\" (UID: \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" Apr 22 19:33:06.544993 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.545000 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl\" (UID: \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" Apr 22 19:33:06.545244 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.545030 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl\" (UID: \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" Apr 22 19:33:06.646342 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.646292 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl\" (UID: \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" Apr 22 19:33:06.646342 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.646344 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl\" (UID: \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" Apr 22 19:33:06.646611 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.646430 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnv72\" (UniqueName: \"kubernetes.io/projected/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-kube-api-access-cnv72\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl\" (UID: \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" Apr 22 19:33:06.646699 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.646679 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl\" (UID: \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" Apr 22 19:33:06.646760 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.646709 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl\" (UID: \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" Apr 22 19:33:06.666250 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.666210 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnv72\" (UniqueName: \"kubernetes.io/projected/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-kube-api-access-cnv72\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl\" (UID: \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" Apr 22 19:33:06.729736 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.729690 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" Apr 22 19:33:06.860307 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:06.860274 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl"] Apr 22 19:33:06.863677 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:33:06.863645 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb8f40ac_bea7_4ad9_a8b9_977dd48dd29c.slice/crio-c51e2374af4c33d066a2270633c4216dd5c668d4e779be31d8f32b4c1dab1096 WatchSource:0}: Error finding container c51e2374af4c33d066a2270633c4216dd5c668d4e779be31d8f32b4c1dab1096: Status 404 returned error can't find the container with id c51e2374af4c33d066a2270633c4216dd5c668d4e779be31d8f32b4c1dab1096 Apr 22 19:33:07.597166 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.597130 2569 generic.go:358] "Generic (PLEG): container finished" podID="cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c" containerID="f480367a8fac641bbfeee1cfe1a5154784505e40efaf018ba3f1af5b3ef6c65e" exitCode=0 Apr 22 19:33:07.597538 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.597192 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" event={"ID":"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c","Type":"ContainerDied","Data":"f480367a8fac641bbfeee1cfe1a5154784505e40efaf018ba3f1af5b3ef6c65e"} Apr 22 19:33:07.597538 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.597214 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" event={"ID":"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c","Type":"ContainerStarted","Data":"c51e2374af4c33d066a2270633c4216dd5c668d4e779be31d8f32b4c1dab1096"} Apr 22 19:33:07.612825 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.612794 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-hztcd"] Apr 22 19:33:07.615936 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.615921 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hztcd" Apr 22 19:33:07.618952 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.618775 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 22 19:33:07.619262 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.619229 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 22 19:33:07.619490 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.619451 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-vj2pl\"" Apr 22 19:33:07.628291 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.628267 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-hztcd"] Apr 22 19:33:07.754726 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.754688 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/12c11dff-6e1c-4d18-bbf1-1ee913fe0186-operator-config\") pod \"servicemesh-operator3-55f49c5f94-hztcd\" (UID: \"12c11dff-6e1c-4d18-bbf1-1ee913fe0186\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hztcd" Apr 22 19:33:07.754897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.754762 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r28zp\" (UniqueName: \"kubernetes.io/projected/12c11dff-6e1c-4d18-bbf1-1ee913fe0186-kube-api-access-r28zp\") pod \"servicemesh-operator3-55f49c5f94-hztcd\" (UID: \"12c11dff-6e1c-4d18-bbf1-1ee913fe0186\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hztcd" Apr 22 19:33:07.855975 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.855874 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r28zp\" (UniqueName: \"kubernetes.io/projected/12c11dff-6e1c-4d18-bbf1-1ee913fe0186-kube-api-access-r28zp\") pod \"servicemesh-operator3-55f49c5f94-hztcd\" (UID: \"12c11dff-6e1c-4d18-bbf1-1ee913fe0186\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hztcd" Apr 22 19:33:07.855975 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.855919 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/12c11dff-6e1c-4d18-bbf1-1ee913fe0186-operator-config\") pod \"servicemesh-operator3-55f49c5f94-hztcd\" (UID: \"12c11dff-6e1c-4d18-bbf1-1ee913fe0186\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hztcd" Apr 22 19:33:07.858571 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.858544 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/12c11dff-6e1c-4d18-bbf1-1ee913fe0186-operator-config\") pod \"servicemesh-operator3-55f49c5f94-hztcd\" (UID: \"12c11dff-6e1c-4d18-bbf1-1ee913fe0186\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hztcd" Apr 22 19:33:07.865229 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.865206 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r28zp\" (UniqueName: \"kubernetes.io/projected/12c11dff-6e1c-4d18-bbf1-1ee913fe0186-kube-api-access-r28zp\") pod \"servicemesh-operator3-55f49c5f94-hztcd\" (UID: 
\"12c11dff-6e1c-4d18-bbf1-1ee913fe0186\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hztcd" Apr 22 19:33:07.925230 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:07.925192 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hztcd" Apr 22 19:33:08.068869 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:08.068839 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-hztcd"] Apr 22 19:33:08.071317 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:33:08.071272 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12c11dff_6e1c_4d18_bbf1_1ee913fe0186.slice/crio-4dc0fd4e8bb3daada3087d8dcaca3e99987eda925a83b64998b322cdf39d6e5d WatchSource:0}: Error finding container 4dc0fd4e8bb3daada3087d8dcaca3e99987eda925a83b64998b322cdf39d6e5d: Status 404 returned error can't find the container with id 4dc0fd4e8bb3daada3087d8dcaca3e99987eda925a83b64998b322cdf39d6e5d Apr 22 19:33:08.602123 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:08.602069 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hztcd" event={"ID":"12c11dff-6e1c-4d18-bbf1-1ee913fe0186","Type":"ContainerStarted","Data":"4dc0fd4e8bb3daada3087d8dcaca3e99987eda925a83b64998b322cdf39d6e5d"} Apr 22 19:33:10.610612 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:10.610579 2569 generic.go:358] "Generic (PLEG): container finished" podID="cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c" containerID="b148f5349f5cdf15f22648346a03848833df360c4e06a01e5ee18d53d063f5b9" exitCode=0 Apr 22 19:33:10.610612 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:10.610625 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" event={"ID":"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c","Type":"ContainerDied","Data":"b148f5349f5cdf15f22648346a03848833df360c4e06a01e5ee18d53d063f5b9"} Apr 22 19:33:11.615270 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:11.615235 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hztcd" event={"ID":"12c11dff-6e1c-4d18-bbf1-1ee913fe0186","Type":"ContainerStarted","Data":"0809ef7f4511ba8f994a131c50228706eb550bf140a94461383fbb632f88599e"} Apr 22 19:33:11.615765 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:11.615325 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hztcd" Apr 22 19:33:11.617076 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:11.617048 2569 generic.go:358] "Generic (PLEG): container finished" podID="cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c" containerID="58ecac7d470de9c5fda282253b2ff8dbab42cd59a0d6835fef750009caa38325" exitCode=0 Apr 22 19:33:11.617219 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:11.617131 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" event={"ID":"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c","Type":"ContainerDied","Data":"58ecac7d470de9c5fda282253b2ff8dbab42cd59a0d6835fef750009caa38325"} Apr 22 19:33:11.637168 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:11.637112 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hztcd" 
podStartSLOduration=2.017648678 podStartE2EDuration="4.637079825s" podCreationTimestamp="2026-04-22 19:33:07 +0000 UTC" firstStartedPulling="2026-04-22 19:33:08.074571247 +0000 UTC m=+560.722586028" lastFinishedPulling="2026-04-22 19:33:10.694002385 +0000 UTC m=+563.342017175" observedRunningTime="2026-04-22 19:33:11.635490413 +0000 UTC m=+564.283505218" watchObservedRunningTime="2026-04-22 19:33:11.637079825 +0000 UTC m=+564.285094628" Apr 22 19:33:12.742152 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:12.742130 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" Apr 22 19:33:12.900084 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:12.899991 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-util\") pod \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\" (UID: \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\") " Apr 22 19:33:12.900084 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:12.900068 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-bundle\") pod \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\" (UID: \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\") " Apr 22 19:33:12.900313 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:12.900150 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnv72\" (UniqueName: \"kubernetes.io/projected/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-kube-api-access-cnv72\") pod \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\" (UID: \"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c\") " Apr 22 19:33:12.901207 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:12.901179 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-bundle" (OuterVolumeSpecName: "bundle") pod "cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c" (UID: "cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:12.902318 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:12.902286 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-kube-api-access-cnv72" (OuterVolumeSpecName: "kube-api-access-cnv72") pod "cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c" (UID: "cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c"). InnerVolumeSpecName "kube-api-access-cnv72". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:33:12.904900 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:12.904879 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-util" (OuterVolumeSpecName: "util") pod "cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c" (UID: "cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:13.001200 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:13.001159 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-util\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:13.001200 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:13.001192 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-bundle\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:13.001402 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:13.001206 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cnv72\" (UniqueName: \"kubernetes.io/projected/cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c-kube-api-access-cnv72\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:13.626039 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:13.626000 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" event={"ID":"cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c","Type":"ContainerDied","Data":"c51e2374af4c33d066a2270633c4216dd5c668d4e779be31d8f32b4c1dab1096"} Apr 22 19:33:13.626039 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:13.626040 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c51e2374af4c33d066a2270633c4216dd5c668d4e779be31d8f32b4c1dab1096" Apr 22 19:33:13.626271 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:13.626054 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebhq2sl" Apr 22 19:33:22.623368 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:22.623334 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hztcd" Apr 22 19:33:32.204526 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.204491 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg"] Apr 22 19:33:32.205004 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.204812 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c" containerName="util" Apr 22 19:33:32.205004 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.204823 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c" containerName="util" Apr 22 19:33:32.205004 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.204839 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c" containerName="pull" Apr 22 19:33:32.205004 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.204844 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c" containerName="pull" Apr 22 19:33:32.205004 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.204857 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c" containerName="extract" Apr 22 19:33:32.205004 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.204862 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c" containerName="extract" Apr 
22 19:33:32.205004 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.204911 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb8f40ac-bea7-4ad9-a8b9-977dd48dd29c" containerName="extract" Apr 22 19:33:32.211444 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.211420 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.214049 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.214026 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 22 19:33:32.214201 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.214026 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 22 19:33:32.214201 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.214191 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-mf4cd\"" Apr 22 19:33:32.214327 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.214213 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 19:33:32.214327 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.214228 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 19:33:32.214327 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.214240 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 22 19:33:32.214484 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.214345 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 19:33:32.218854 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.218675 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg"] Apr 22 19:33:32.336028 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.335993 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.336028 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.336030 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a51228d5-4229-4181-a691-5eb6ead8841f-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.336286 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.336054 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.336286 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.336171 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.336286 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.336209 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvmck\" (UniqueName: \"kubernetes.io/projected/a51228d5-4229-4181-a691-5eb6ead8841f-kube-api-access-bvmck\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.336286 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.336237 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a51228d5-4229-4181-a691-5eb6ead8841f-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.336286 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.336258 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a51228d5-4229-4181-a691-5eb6ead8841f-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.437433 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.437381 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.437433 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.437431 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a51228d5-4229-4181-a691-5eb6ead8841f-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.437694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.437456 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.437694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.437512 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 
19:33:32.437694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.437534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvmck\" (UniqueName: \"kubernetes.io/projected/a51228d5-4229-4181-a691-5eb6ead8841f-kube-api-access-bvmck\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.437694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.437566 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a51228d5-4229-4181-a691-5eb6ead8841f-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.437897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.437719 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a51228d5-4229-4181-a691-5eb6ead8841f-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.438389 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.438365 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a51228d5-4229-4181-a691-5eb6ead8841f-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.440072 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.440054 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.440356 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.440330 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a51228d5-4229-4181-a691-5eb6ead8841f-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.440601 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.440583 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.440676 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.440627 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.445816 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:33:32.445794 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a51228d5-4229-4181-a691-5eb6ead8841f-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.445925 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.445902 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvmck\" (UniqueName: \"kubernetes.io/projected/a51228d5-4229-4181-a691-5eb6ead8841f-kube-api-access-bvmck\") pod \"istiod-openshift-gateway-7cd77c7ffd-s5lhg\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.521985 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.521889 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:32.657779 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.657753 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg"] Apr 22 19:33:32.660645 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:33:32.660617 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda51228d5_4229_4181_a691_5eb6ead8841f.slice/crio-5482df703d2d3b7529e77ffd85f24589c095f29bc957247d88d81b2347483869 WatchSource:0}: Error finding container 5482df703d2d3b7529e77ffd85f24589c095f29bc957247d88d81b2347483869: Status 404 returned error can't find the container with id 5482df703d2d3b7529e77ffd85f24589c095f29bc957247d88d81b2347483869 Apr 22 19:33:32.688411 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:32.688369 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" event={"ID":"a51228d5-4229-4181-a691-5eb6ead8841f","Type":"ContainerStarted","Data":"5482df703d2d3b7529e77ffd85f24589c095f29bc957247d88d81b2347483869"} Apr 22 19:33:35.538602 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:35.538558 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 19:33:35.538891 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:35.538633 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 19:33:35.700539 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:35.700492 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" event={"ID":"a51228d5-4229-4181-a691-5eb6ead8841f","Type":"ContainerStarted","Data":"50ecaab714dd8938e36706dd606ebc45f7acb0b4a926d011a441a18a81244fb6"} Apr 22 19:33:35.700723 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:35.700622 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:35.723047 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:35.722984 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" podStartSLOduration=0.847920818 
podStartE2EDuration="3.722964914s" podCreationTimestamp="2026-04-22 19:33:32 +0000 UTC" firstStartedPulling="2026-04-22 19:33:32.663313851 +0000 UTC m=+585.311328637" lastFinishedPulling="2026-04-22 19:33:35.538357951 +0000 UTC m=+588.186372733" observedRunningTime="2026-04-22 19:33:35.720345728 +0000 UTC m=+588.368360533" watchObservedRunningTime="2026-04-22 19:33:35.722964914 +0000 UTC m=+588.370979718" Apr 22 19:33:36.705992 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:36.705948 2569 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-s5lhg container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 22 19:33:36.706399 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:36.706006 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" podUID="a51228d5-4229-4181-a691-5eb6ead8841f" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:33:39.705919 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:39.705878 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:33:40.058842 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.058758 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2"] Apr 22 19:33:40.061296 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.061267 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.064090 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.064062 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-jcpdv\"" Apr 22 19:33:40.072127 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.071823 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2"] Apr 22 19:33:40.199500 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.199461 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgpm8\" (UniqueName: \"kubernetes.io/projected/71df9112-0f4c-45a0-8daa-f289e7cccb4f-kube-api-access-zgpm8\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.199687 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.199516 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.199687 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.199582 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-credential-socket\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.199687 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.199636 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.199687 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.199656 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.199687 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.199677 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.199875 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.199696 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.199875 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.199725 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.199875 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.199782 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.300382 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.300344 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.300382 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.300386 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.300616 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.300408 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.300616 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.300523 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.300616 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.300582 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.300741 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.300626 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.300741 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.300690 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgpm8\" (UniqueName: \"kubernetes.io/projected/71df9112-0f4c-45a0-8daa-f289e7cccb4f-kube-api-access-zgpm8\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.300828 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.300738 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.300828 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.300766 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.300930 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.300855 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.300984 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.300949 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.301275 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.301247 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.301397 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.301373 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.301623 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.301604 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.302878 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.302857 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.303383 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.303362 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.311277 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.311220 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgpm8\" (UniqueName: \"kubernetes.io/projected/71df9112-0f4c-45a0-8daa-f289e7cccb4f-kube-api-access-zgpm8\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.311666 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.311643 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/71df9112-0f4c-45a0-8daa-f289e7cccb4f-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8q9q2\" (UID: \"71df9112-0f4c-45a0-8daa-f289e7cccb4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.375943 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.375898 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:40.514873 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.514836 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2"] Apr 22 19:33:40.517350 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:33:40.517325 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71df9112_0f4c_45a0_8daa_f289e7cccb4f.slice/crio-bc02ba0441828391e2c0a368d572776630b4d7244c1464fc424dc205b52416fe WatchSource:0}: Error finding container bc02ba0441828391e2c0a368d572776630b4d7244c1464fc424dc205b52416fe: Status 404 returned error can't find the container with id bc02ba0441828391e2c0a368d572776630b4d7244c1464fc424dc205b52416fe Apr 22 19:33:40.718091 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:40.718057 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" event={"ID":"71df9112-0f4c-45a0-8daa-f289e7cccb4f","Type":"ContainerStarted","Data":"bc02ba0441828391e2c0a368d572776630b4d7244c1464fc424dc205b52416fe"} Apr 22 19:33:47.032028 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:47.031990 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 19:33:47.032330 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:47.032061 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 19:33:47.032330 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:47.032114 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 19:33:47.744165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:47.744126 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" 
event={"ID":"71df9112-0f4c-45a0-8daa-f289e7cccb4f","Type":"ContainerStarted","Data":"cf72d1da175545905b3e872d1b4b39c10777b0523ce95910bde096dfc8cb98c1"} Apr 22 19:33:47.768756 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:47.768682 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" podStartSLOduration=1.256082977 podStartE2EDuration="7.76866894s" podCreationTimestamp="2026-04-22 19:33:40 +0000 UTC" firstStartedPulling="2026-04-22 19:33:40.519170382 +0000 UTC m=+593.167185164" lastFinishedPulling="2026-04-22 19:33:47.031756342 +0000 UTC m=+599.679771127" observedRunningTime="2026-04-22 19:33:47.766277806 +0000 UTC m=+600.414292624" watchObservedRunningTime="2026-04-22 19:33:47.76866894 +0000 UTC m=+600.416683743" Apr 22 19:33:47.881639 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:47.881603 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:33:47.882564 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:47.882541 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:33:48.376430 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:48.376392 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:48.381131 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:48.381086 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:48.747525 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:48.747492 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:48.748432 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:48.748411 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8q9q2" Apr 22 19:33:51.448147 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.448110 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj"] Apr 22 19:33:51.450551 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.450532 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" Apr 22 19:33:51.453341 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.453311 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 19:33:51.453453 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.453347 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 19:33:51.454503 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.454483 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-84tsk\"" Apr 22 19:33:51.459729 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.459704 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj"] Apr 22 19:33:51.549131 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.549077 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x"] Apr 22 19:33:51.551463 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.551445 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" Apr 22 19:33:51.560156 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.560129 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x"] Apr 22 19:33:51.592855 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.592823 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf8ab3af-8dce-4deb-91c6-088d13208661-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj\" (UID: \"bf8ab3af-8dce-4deb-91c6-088d13208661\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" Apr 22 19:33:51.592855 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.592866 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnvnc\" (UniqueName: \"kubernetes.io/projected/bf8ab3af-8dce-4deb-91c6-088d13208661-kube-api-access-dnvnc\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj\" (UID: \"bf8ab3af-8dce-4deb-91c6-088d13208661\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" Apr 22 19:33:51.593057 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.592891 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf8ab3af-8dce-4deb-91c6-088d13208661-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj\" (UID: \"bf8ab3af-8dce-4deb-91c6-088d13208661\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" Apr 22 19:33:51.647006 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.646968 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m"] Apr 22 19:33:51.649262 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.649245 2569 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" Apr 22 19:33:51.658467 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.658431 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m"] Apr 22 19:33:51.693917 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.693880 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72478216-4930-46d9-bde5-0cc8c15d42fd-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x\" (UID: \"72478216-4930-46d9-bde5-0cc8c15d42fd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" Apr 22 19:33:51.694088 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.693924 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72478216-4930-46d9-bde5-0cc8c15d42fd-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x\" (UID: \"72478216-4930-46d9-bde5-0cc8c15d42fd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" Apr 22 19:33:51.694088 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.693951 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22qj4\" (UniqueName: \"kubernetes.io/projected/72478216-4930-46d9-bde5-0cc8c15d42fd-kube-api-access-22qj4\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x\" (UID: \"72478216-4930-46d9-bde5-0cc8c15d42fd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" Apr 22 19:33:51.694088 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.694024 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf8ab3af-8dce-4deb-91c6-088d13208661-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj\" (UID: \"bf8ab3af-8dce-4deb-91c6-088d13208661\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" Apr 22 19:33:51.694285 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.694116 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnvnc\" (UniqueName: \"kubernetes.io/projected/bf8ab3af-8dce-4deb-91c6-088d13208661-kube-api-access-dnvnc\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj\" (UID: \"bf8ab3af-8dce-4deb-91c6-088d13208661\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" Apr 22 19:33:51.694285 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.694171 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf8ab3af-8dce-4deb-91c6-088d13208661-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj\" (UID: \"bf8ab3af-8dce-4deb-91c6-088d13208661\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" Apr 22 19:33:51.694385 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.694365 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/bf8ab3af-8dce-4deb-91c6-088d13208661-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj\" (UID: \"bf8ab3af-8dce-4deb-91c6-088d13208661\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" Apr 22 19:33:51.694476 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.694458 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf8ab3af-8dce-4deb-91c6-088d13208661-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj\" (UID: \"bf8ab3af-8dce-4deb-91c6-088d13208661\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" Apr 22 19:33:51.703565 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.703513 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnvnc\" (UniqueName: \"kubernetes.io/projected/bf8ab3af-8dce-4deb-91c6-088d13208661-kube-api-access-dnvnc\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj\" (UID: \"bf8ab3af-8dce-4deb-91c6-088d13208661\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" Apr 22 19:33:51.756979 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.756918 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847"] Apr 22 19:33:51.760387 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.760358 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" Apr 22 19:33:51.760522 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.760504 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" Apr 22 19:33:51.773844 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.773815 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847"] Apr 22 19:33:51.795206 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.795169 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72478216-4930-46d9-bde5-0cc8c15d42fd-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x\" (UID: \"72478216-4930-46d9-bde5-0cc8c15d42fd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" Apr 22 19:33:51.795206 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.795215 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfhlc\" (UniqueName: \"kubernetes.io/projected/530defbb-69c4-436d-9fbf-c85bea38c64e-kube-api-access-wfhlc\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m\" (UID: \"530defbb-69c4-436d-9fbf-c85bea38c64e\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" Apr 22 19:33:51.795454 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.795239 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72478216-4930-46d9-bde5-0cc8c15d42fd-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x\" (UID: \"72478216-4930-46d9-bde5-0cc8c15d42fd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" Apr 22 19:33:51.795454 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.795259 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22qj4\" (UniqueName: \"kubernetes.io/projected/72478216-4930-46d9-bde5-0cc8c15d42fd-kube-api-access-22qj4\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x\" (UID: \"72478216-4930-46d9-bde5-0cc8c15d42fd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" Apr 22 19:33:51.795454 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.795284 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/530defbb-69c4-436d-9fbf-c85bea38c64e-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m\" (UID: \"530defbb-69c4-436d-9fbf-c85bea38c64e\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" Apr 22 19:33:51.795454 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.795303 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/530defbb-69c4-436d-9fbf-c85bea38c64e-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m\" (UID: \"530defbb-69c4-436d-9fbf-c85bea38c64e\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" Apr 22 19:33:51.795659 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.795597 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72478216-4930-46d9-bde5-0cc8c15d42fd-util\") pod 
\"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x\" (UID: \"72478216-4930-46d9-bde5-0cc8c15d42fd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" Apr 22 19:33:51.795659 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.795650 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72478216-4930-46d9-bde5-0cc8c15d42fd-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x\" (UID: \"72478216-4930-46d9-bde5-0cc8c15d42fd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" Apr 22 19:33:51.804557 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.804525 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22qj4\" (UniqueName: \"kubernetes.io/projected/72478216-4930-46d9-bde5-0cc8c15d42fd-kube-api-access-22qj4\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x\" (UID: \"72478216-4930-46d9-bde5-0cc8c15d42fd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" Apr 22 19:33:51.861393 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.861338 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" Apr 22 19:33:51.896493 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.896424 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfhlc\" (UniqueName: \"kubernetes.io/projected/530defbb-69c4-436d-9fbf-c85bea38c64e-kube-api-access-wfhlc\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m\" (UID: \"530defbb-69c4-436d-9fbf-c85bea38c64e\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" Apr 22 19:33:51.896493 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.896486 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/530defbb-69c4-436d-9fbf-c85bea38c64e-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m\" (UID: \"530defbb-69c4-436d-9fbf-c85bea38c64e\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" Apr 22 19:33:51.896845 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.896523 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/530defbb-69c4-436d-9fbf-c85bea38c64e-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m\" (UID: \"530defbb-69c4-436d-9fbf-c85bea38c64e\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" Apr 22 19:33:51.896845 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.896567 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c065f30-eb67-42aa-a23e-92556f94f4ad-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847\" (UID: \"5c065f30-eb67-42aa-a23e-92556f94f4ad\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" Apr 22 19:33:51.896845 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.896598 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nggcp\" (UniqueName: \"kubernetes.io/projected/5c065f30-eb67-42aa-a23e-92556f94f4ad-kube-api-access-nggcp\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847\" (UID: \"5c065f30-eb67-42aa-a23e-92556f94f4ad\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" Apr 22 19:33:51.896845 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.896645 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c065f30-eb67-42aa-a23e-92556f94f4ad-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847\" (UID: \"5c065f30-eb67-42aa-a23e-92556f94f4ad\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" Apr 22 19:33:51.897368 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.897055 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/530defbb-69c4-436d-9fbf-c85bea38c64e-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m\" (UID: \"530defbb-69c4-436d-9fbf-c85bea38c64e\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" Apr 22 19:33:51.897368 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.897144 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/530defbb-69c4-436d-9fbf-c85bea38c64e-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m\" (UID: \"530defbb-69c4-436d-9fbf-c85bea38c64e\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" Apr 22 19:33:51.900010 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.899956 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj"] Apr 22 19:33:51.903698 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:33:51.903665 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf8ab3af_8dce_4deb_91c6_088d13208661.slice/crio-9b418eeb24f6e4a64bf9b8abbc77acf6a84e0c255b8a3a4f34c23d95d7d763a0 WatchSource:0}: Error finding container 9b418eeb24f6e4a64bf9b8abbc77acf6a84e0c255b8a3a4f34c23d95d7d763a0: Status 404 returned error can't find the container with id 9b418eeb24f6e4a64bf9b8abbc77acf6a84e0c255b8a3a4f34c23d95d7d763a0 Apr 22 19:33:51.905393 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.905366 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfhlc\" (UniqueName: \"kubernetes.io/projected/530defbb-69c4-436d-9fbf-c85bea38c64e-kube-api-access-wfhlc\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m\" (UID: \"530defbb-69c4-436d-9fbf-c85bea38c64e\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" Apr 22 19:33:51.958173 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.958083 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" Apr 22 19:33:51.993591 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.993552 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x"] Apr 22 19:33:51.997224 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.997173 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c065f30-eb67-42aa-a23e-92556f94f4ad-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847\" (UID: \"5c065f30-eb67-42aa-a23e-92556f94f4ad\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" Apr 22 19:33:51.997345 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.997222 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nggcp\" (UniqueName: \"kubernetes.io/projected/5c065f30-eb67-42aa-a23e-92556f94f4ad-kube-api-access-nggcp\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847\" (UID: \"5c065f30-eb67-42aa-a23e-92556f94f4ad\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" Apr 22 19:33:51.997345 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.997272 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c065f30-eb67-42aa-a23e-92556f94f4ad-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847\" (UID: \"5c065f30-eb67-42aa-a23e-92556f94f4ad\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" Apr 22 19:33:51.997604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.997583 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c065f30-eb67-42aa-a23e-92556f94f4ad-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847\" (UID: \"5c065f30-eb67-42aa-a23e-92556f94f4ad\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" Apr 22 19:33:51.997654 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:51.997606 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c065f30-eb67-42aa-a23e-92556f94f4ad-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847\" (UID: \"5c065f30-eb67-42aa-a23e-92556f94f4ad\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" Apr 22 19:33:51.997924 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:33:51.997891 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72478216_4930_46d9_bde5_0cc8c15d42fd.slice/crio-4cd64309b5b3c24295e131c2d1885c39a8102ac745f813fd2573ef6b50420cbd WatchSource:0}: Error finding container 4cd64309b5b3c24295e131c2d1885c39a8102ac745f813fd2573ef6b50420cbd: Status 404 returned error can't find the container with id 4cd64309b5b3c24295e131c2d1885c39a8102ac745f813fd2573ef6b50420cbd Apr 22 19:33:52.008157 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.008132 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nggcp\" (UniqueName: 
\"kubernetes.io/projected/5c065f30-eb67-42aa-a23e-92556f94f4ad-kube-api-access-nggcp\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847\" (UID: \"5c065f30-eb67-42aa-a23e-92556f94f4ad\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" Apr 22 19:33:52.084124 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.084086 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" Apr 22 19:33:52.111844 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.111808 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m"] Apr 22 19:33:52.115237 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:33:52.115208 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod530defbb_69c4_436d_9fbf_c85bea38c64e.slice/crio-3f957ad8cc8d5d458c6e9be1d6200f11b3dd88dc06fedc385b4c3f57bb0290cd WatchSource:0}: Error finding container 3f957ad8cc8d5d458c6e9be1d6200f11b3dd88dc06fedc385b4c3f57bb0290cd: Status 404 returned error can't find the container with id 3f957ad8cc8d5d458c6e9be1d6200f11b3dd88dc06fedc385b4c3f57bb0290cd Apr 22 19:33:52.216919 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.216890 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847"] Apr 22 19:33:52.219228 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:33:52.219201 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c065f30_eb67_42aa_a23e_92556f94f4ad.slice/crio-27f42e7ffbb0b8961327f0d1ca6c303458fcc0dd6fd269903d3d26e97ef291b6 WatchSource:0}: Error finding container 27f42e7ffbb0b8961327f0d1ca6c303458fcc0dd6fd269903d3d26e97ef291b6: Status 404 returned error can't find the container with id 27f42e7ffbb0b8961327f0d1ca6c303458fcc0dd6fd269903d3d26e97ef291b6 Apr 22 19:33:52.765690 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.765657 2569 generic.go:358] "Generic (PLEG): container finished" podID="530defbb-69c4-436d-9fbf-c85bea38c64e" containerID="1c0dc8898870b8e70d5aa9fa33bf9e5715f2532619c75073cbf4fd230d28b2d5" exitCode=0 Apr 22 19:33:52.766158 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.765748 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" event={"ID":"530defbb-69c4-436d-9fbf-c85bea38c64e","Type":"ContainerDied","Data":"1c0dc8898870b8e70d5aa9fa33bf9e5715f2532619c75073cbf4fd230d28b2d5"} Apr 22 19:33:52.766158 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.765790 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" event={"ID":"530defbb-69c4-436d-9fbf-c85bea38c64e","Type":"ContainerStarted","Data":"3f957ad8cc8d5d458c6e9be1d6200f11b3dd88dc06fedc385b4c3f57bb0290cd"} Apr 22 19:33:52.767159 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.767116 2569 generic.go:358] "Generic (PLEG): container finished" podID="72478216-4930-46d9-bde5-0cc8c15d42fd" containerID="5bca959d3a8b9ee5b0d208e83cbf9f495bb533d97ee27929121034991816ed94" exitCode=0 Apr 22 19:33:52.767232 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.767177 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" event={"ID":"72478216-4930-46d9-bde5-0cc8c15d42fd","Type":"ContainerDied","Data":"5bca959d3a8b9ee5b0d208e83cbf9f495bb533d97ee27929121034991816ed94"} Apr 22 19:33:52.767232 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.767200 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" event={"ID":"72478216-4930-46d9-bde5-0cc8c15d42fd","Type":"ContainerStarted","Data":"4cd64309b5b3c24295e131c2d1885c39a8102ac745f813fd2573ef6b50420cbd"} Apr 22 19:33:52.768595 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.768576 2569 generic.go:358] "Generic (PLEG): container finished" podID="5c065f30-eb67-42aa-a23e-92556f94f4ad" containerID="402e9126cc3f5d905ba23346fd70fdac57796f95d026cfb91d92295b94902cbb" exitCode=0 Apr 22 19:33:52.768677 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.768656 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" event={"ID":"5c065f30-eb67-42aa-a23e-92556f94f4ad","Type":"ContainerDied","Data":"402e9126cc3f5d905ba23346fd70fdac57796f95d026cfb91d92295b94902cbb"} Apr 22 19:33:52.768730 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.768688 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" event={"ID":"5c065f30-eb67-42aa-a23e-92556f94f4ad","Type":"ContainerStarted","Data":"27f42e7ffbb0b8961327f0d1ca6c303458fcc0dd6fd269903d3d26e97ef291b6"} Apr 22 19:33:52.770085 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.770059 2569 generic.go:358] "Generic (PLEG): container finished" podID="bf8ab3af-8dce-4deb-91c6-088d13208661" containerID="3edce6aaa58cc4c6cccd2592b117a63e18e61d9c06310365268cd1f41c547db1" exitCode=0 Apr 22 19:33:52.770185 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.770107 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" event={"ID":"bf8ab3af-8dce-4deb-91c6-088d13208661","Type":"ContainerDied","Data":"3edce6aaa58cc4c6cccd2592b117a63e18e61d9c06310365268cd1f41c547db1"} Apr 22 19:33:52.770185 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:52.770127 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" event={"ID":"bf8ab3af-8dce-4deb-91c6-088d13208661","Type":"ContainerStarted","Data":"9b418eeb24f6e4a64bf9b8abbc77acf6a84e0c255b8a3a4f34c23d95d7d763a0"} Apr 22 19:33:53.775265 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:53.775214 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" event={"ID":"72478216-4930-46d9-bde5-0cc8c15d42fd","Type":"ContainerStarted","Data":"d6dfb0fbe17d1e988ba140f24ff6608622e9b7525a9457b17b7bc31d0a9fc273"} Apr 22 19:33:53.777262 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:53.777082 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" event={"ID":"bf8ab3af-8dce-4deb-91c6-088d13208661","Type":"ContainerStarted","Data":"7357a252641891d6318df943a544dadd866c141df6a09b3b6d1906dd38db0afd"} Apr 22 19:33:53.779333 ip-10-0-140-242 kubenswrapper[2569]: I0422 
19:33:53.779297 2569 generic.go:358] "Generic (PLEG): container finished" podID="530defbb-69c4-436d-9fbf-c85bea38c64e" containerID="5544e58d816341652e6c3731eaee1bddc837d2c613f89f5e1410b9dfa3bb25ad" exitCode=0 Apr 22 19:33:53.779432 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:53.779334 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" event={"ID":"530defbb-69c4-436d-9fbf-c85bea38c64e","Type":"ContainerDied","Data":"5544e58d816341652e6c3731eaee1bddc837d2c613f89f5e1410b9dfa3bb25ad"} Apr 22 19:33:54.783792 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:54.783748 2569 generic.go:358] "Generic (PLEG): container finished" podID="bf8ab3af-8dce-4deb-91c6-088d13208661" containerID="7357a252641891d6318df943a544dadd866c141df6a09b3b6d1906dd38db0afd" exitCode=0 Apr 22 19:33:54.784273 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:54.783837 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" event={"ID":"bf8ab3af-8dce-4deb-91c6-088d13208661","Type":"ContainerDied","Data":"7357a252641891d6318df943a544dadd866c141df6a09b3b6d1906dd38db0afd"} Apr 22 19:33:54.785918 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:54.785891 2569 generic.go:358] "Generic (PLEG): container finished" podID="530defbb-69c4-436d-9fbf-c85bea38c64e" containerID="ce567f38236d38bbc2e2c303557b68562076698691a1cb9c310e0395f09fac2e" exitCode=0 Apr 22 19:33:54.786022 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:54.785970 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" event={"ID":"530defbb-69c4-436d-9fbf-c85bea38c64e","Type":"ContainerDied","Data":"ce567f38236d38bbc2e2c303557b68562076698691a1cb9c310e0395f09fac2e"} Apr 22 19:33:54.787605 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:54.787580 2569 generic.go:358] "Generic (PLEG): container finished" podID="72478216-4930-46d9-bde5-0cc8c15d42fd" containerID="d6dfb0fbe17d1e988ba140f24ff6608622e9b7525a9457b17b7bc31d0a9fc273" exitCode=0 Apr 22 19:33:54.787709 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:54.787664 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" event={"ID":"72478216-4930-46d9-bde5-0cc8c15d42fd","Type":"ContainerDied","Data":"d6dfb0fbe17d1e988ba140f24ff6608622e9b7525a9457b17b7bc31d0a9fc273"} Apr 22 19:33:54.789446 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:54.789426 2569 generic.go:358] "Generic (PLEG): container finished" podID="5c065f30-eb67-42aa-a23e-92556f94f4ad" containerID="81c4f4921c391ab87f1b973d472304090a0ff69ff68c4b83c27a7fb0eed37aed" exitCode=0 Apr 22 19:33:54.789520 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:54.789449 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" event={"ID":"5c065f30-eb67-42aa-a23e-92556f94f4ad","Type":"ContainerDied","Data":"81c4f4921c391ab87f1b973d472304090a0ff69ff68c4b83c27a7fb0eed37aed"} Apr 22 19:33:55.795402 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:55.795361 2569 generic.go:358] "Generic (PLEG): container finished" podID="bf8ab3af-8dce-4deb-91c6-088d13208661" containerID="8c987b349a0857630f376557fd658c527642e59fbaed237f62893a40fcfa1c7c" exitCode=0 Apr 22 19:33:55.795831 ip-10-0-140-242 kubenswrapper[2569]: I0422 
19:33:55.795438 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" event={"ID":"bf8ab3af-8dce-4deb-91c6-088d13208661","Type":"ContainerDied","Data":"8c987b349a0857630f376557fd658c527642e59fbaed237f62893a40fcfa1c7c"} Apr 22 19:33:55.797244 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:55.797220 2569 generic.go:358] "Generic (PLEG): container finished" podID="72478216-4930-46d9-bde5-0cc8c15d42fd" containerID="940caa7c5523a8d75053c12e51539791ca73853134a6c87c8bb1a0b3222884b1" exitCode=0 Apr 22 19:33:55.797387 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:55.797304 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" event={"ID":"72478216-4930-46d9-bde5-0cc8c15d42fd","Type":"ContainerDied","Data":"940caa7c5523a8d75053c12e51539791ca73853134a6c87c8bb1a0b3222884b1"} Apr 22 19:33:55.799056 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:55.799032 2569 generic.go:358] "Generic (PLEG): container finished" podID="5c065f30-eb67-42aa-a23e-92556f94f4ad" containerID="ae69cc0d1d655e826c833acfc86e68acfa8157031b0bf134718d1865109954d8" exitCode=0 Apr 22 19:33:55.799157 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:55.799134 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" event={"ID":"5c065f30-eb67-42aa-a23e-92556f94f4ad","Type":"ContainerDied","Data":"ae69cc0d1d655e826c833acfc86e68acfa8157031b0bf134718d1865109954d8"} Apr 22 19:33:55.928294 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:55.928268 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" Apr 22 19:33:56.029825 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.029791 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfhlc\" (UniqueName: \"kubernetes.io/projected/530defbb-69c4-436d-9fbf-c85bea38c64e-kube-api-access-wfhlc\") pod \"530defbb-69c4-436d-9fbf-c85bea38c64e\" (UID: \"530defbb-69c4-436d-9fbf-c85bea38c64e\") " Apr 22 19:33:56.029979 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.029840 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/530defbb-69c4-436d-9fbf-c85bea38c64e-bundle\") pod \"530defbb-69c4-436d-9fbf-c85bea38c64e\" (UID: \"530defbb-69c4-436d-9fbf-c85bea38c64e\") " Apr 22 19:33:56.029979 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.029861 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/530defbb-69c4-436d-9fbf-c85bea38c64e-util\") pod \"530defbb-69c4-436d-9fbf-c85bea38c64e\" (UID: \"530defbb-69c4-436d-9fbf-c85bea38c64e\") " Apr 22 19:33:56.030374 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.030340 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/530defbb-69c4-436d-9fbf-c85bea38c64e-bundle" (OuterVolumeSpecName: "bundle") pod "530defbb-69c4-436d-9fbf-c85bea38c64e" (UID: "530defbb-69c4-436d-9fbf-c85bea38c64e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:56.032135 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.032090 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/530defbb-69c4-436d-9fbf-c85bea38c64e-kube-api-access-wfhlc" (OuterVolumeSpecName: "kube-api-access-wfhlc") pod "530defbb-69c4-436d-9fbf-c85bea38c64e" (UID: "530defbb-69c4-436d-9fbf-c85bea38c64e"). InnerVolumeSpecName "kube-api-access-wfhlc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:33:56.035285 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.035252 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/530defbb-69c4-436d-9fbf-c85bea38c64e-util" (OuterVolumeSpecName: "util") pod "530defbb-69c4-436d-9fbf-c85bea38c64e" (UID: "530defbb-69c4-436d-9fbf-c85bea38c64e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:56.131130 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.131005 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wfhlc\" (UniqueName: \"kubernetes.io/projected/530defbb-69c4-436d-9fbf-c85bea38c64e-kube-api-access-wfhlc\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:56.131130 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.131050 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/530defbb-69c4-436d-9fbf-c85bea38c64e-bundle\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:56.131130 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.131060 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/530defbb-69c4-436d-9fbf-c85bea38c64e-util\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:56.803708 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.803669 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" Apr 22 19:33:56.803708 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.803674 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88z8d6m" event={"ID":"530defbb-69c4-436d-9fbf-c85bea38c64e","Type":"ContainerDied","Data":"3f957ad8cc8d5d458c6e9be1d6200f11b3dd88dc06fedc385b4c3f57bb0290cd"} Apr 22 19:33:56.803708 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.803713 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f957ad8cc8d5d458c6e9be1d6200f11b3dd88dc06fedc385b4c3f57bb0290cd" Apr 22 19:33:56.938941 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.938913 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" Apr 22 19:33:56.989273 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.989242 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" Apr 22 19:33:56.992248 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:56.992228 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" Apr 22 19:33:57.039611 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.039575 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c065f30-eb67-42aa-a23e-92556f94f4ad-util\") pod \"5c065f30-eb67-42aa-a23e-92556f94f4ad\" (UID: \"5c065f30-eb67-42aa-a23e-92556f94f4ad\") " Apr 22 19:33:57.039611 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.039617 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72478216-4930-46d9-bde5-0cc8c15d42fd-bundle\") pod \"72478216-4930-46d9-bde5-0cc8c15d42fd\" (UID: \"72478216-4930-46d9-bde5-0cc8c15d42fd\") " Apr 22 19:33:57.039905 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.039649 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf8ab3af-8dce-4deb-91c6-088d13208661-bundle\") pod \"bf8ab3af-8dce-4deb-91c6-088d13208661\" (UID: \"bf8ab3af-8dce-4deb-91c6-088d13208661\") " Apr 22 19:33:57.039905 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.039698 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72478216-4930-46d9-bde5-0cc8c15d42fd-util\") pod \"72478216-4930-46d9-bde5-0cc8c15d42fd\" (UID: \"72478216-4930-46d9-bde5-0cc8c15d42fd\") " Apr 22 19:33:57.039905 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.039722 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c065f30-eb67-42aa-a23e-92556f94f4ad-bundle\") pod \"5c065f30-eb67-42aa-a23e-92556f94f4ad\" (UID: \"5c065f30-eb67-42aa-a23e-92556f94f4ad\") " Apr 22 19:33:57.039905 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.039747 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22qj4\" (UniqueName: \"kubernetes.io/projected/72478216-4930-46d9-bde5-0cc8c15d42fd-kube-api-access-22qj4\") pod \"72478216-4930-46d9-bde5-0cc8c15d42fd\" (UID: \"72478216-4930-46d9-bde5-0cc8c15d42fd\") " Apr 22 19:33:57.039905 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.039838 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnvnc\" (UniqueName: \"kubernetes.io/projected/bf8ab3af-8dce-4deb-91c6-088d13208661-kube-api-access-dnvnc\") pod \"bf8ab3af-8dce-4deb-91c6-088d13208661\" (UID: \"bf8ab3af-8dce-4deb-91c6-088d13208661\") " Apr 22 19:33:57.040208 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.039871 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf8ab3af-8dce-4deb-91c6-088d13208661-util\") pod \"bf8ab3af-8dce-4deb-91c6-088d13208661\" (UID: \"bf8ab3af-8dce-4deb-91c6-088d13208661\") " Apr 22 19:33:57.040302 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.040269 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c065f30-eb67-42aa-a23e-92556f94f4ad-bundle" (OuterVolumeSpecName: "bundle") pod "5c065f30-eb67-42aa-a23e-92556f94f4ad" (UID: "5c065f30-eb67-42aa-a23e-92556f94f4ad"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:57.040356 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.040292 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72478216-4930-46d9-bde5-0cc8c15d42fd-bundle" (OuterVolumeSpecName: "bundle") pod "72478216-4930-46d9-bde5-0cc8c15d42fd" (UID: "72478216-4930-46d9-bde5-0cc8c15d42fd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:57.040481 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.040456 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nggcp\" (UniqueName: \"kubernetes.io/projected/5c065f30-eb67-42aa-a23e-92556f94f4ad-kube-api-access-nggcp\") pod \"5c065f30-eb67-42aa-a23e-92556f94f4ad\" (UID: \"5c065f30-eb67-42aa-a23e-92556f94f4ad\") " Apr 22 19:33:57.040812 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.040698 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c065f30-eb67-42aa-a23e-92556f94f4ad-bundle\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:57.040812 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.040716 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72478216-4930-46d9-bde5-0cc8c15d42fd-bundle\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:57.040958 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.040920 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf8ab3af-8dce-4deb-91c6-088d13208661-bundle" (OuterVolumeSpecName: "bundle") pod "bf8ab3af-8dce-4deb-91c6-088d13208661" (UID: "bf8ab3af-8dce-4deb-91c6-088d13208661"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:57.043230 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.043207 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72478216-4930-46d9-bde5-0cc8c15d42fd-kube-api-access-22qj4" (OuterVolumeSpecName: "kube-api-access-22qj4") pod "72478216-4930-46d9-bde5-0cc8c15d42fd" (UID: "72478216-4930-46d9-bde5-0cc8c15d42fd"). InnerVolumeSpecName "kube-api-access-22qj4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:33:57.043230 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.043203 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf8ab3af-8dce-4deb-91c6-088d13208661-kube-api-access-dnvnc" (OuterVolumeSpecName: "kube-api-access-dnvnc") pod "bf8ab3af-8dce-4deb-91c6-088d13208661" (UID: "bf8ab3af-8dce-4deb-91c6-088d13208661"). InnerVolumeSpecName "kube-api-access-dnvnc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:33:57.043369 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.043313 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c065f30-eb67-42aa-a23e-92556f94f4ad-kube-api-access-nggcp" (OuterVolumeSpecName: "kube-api-access-nggcp") pod "5c065f30-eb67-42aa-a23e-92556f94f4ad" (UID: "5c065f30-eb67-42aa-a23e-92556f94f4ad"). InnerVolumeSpecName "kube-api-access-nggcp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:33:57.045601 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.045582 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72478216-4930-46d9-bde5-0cc8c15d42fd-util" (OuterVolumeSpecName: "util") pod "72478216-4930-46d9-bde5-0cc8c15d42fd" (UID: "72478216-4930-46d9-bde5-0cc8c15d42fd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:57.046987 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.046966 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c065f30-eb67-42aa-a23e-92556f94f4ad-util" (OuterVolumeSpecName: "util") pod "5c065f30-eb67-42aa-a23e-92556f94f4ad" (UID: "5c065f30-eb67-42aa-a23e-92556f94f4ad"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:57.047775 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.047757 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf8ab3af-8dce-4deb-91c6-088d13208661-util" (OuterVolumeSpecName: "util") pod "bf8ab3af-8dce-4deb-91c6-088d13208661" (UID: "bf8ab3af-8dce-4deb-91c6-088d13208661"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:57.141955 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.141866 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72478216-4930-46d9-bde5-0cc8c15d42fd-util\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:57.141955 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.141897 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-22qj4\" (UniqueName: \"kubernetes.io/projected/72478216-4930-46d9-bde5-0cc8c15d42fd-kube-api-access-22qj4\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:57.141955 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.141908 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dnvnc\" (UniqueName: \"kubernetes.io/projected/bf8ab3af-8dce-4deb-91c6-088d13208661-kube-api-access-dnvnc\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:57.141955 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.141917 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf8ab3af-8dce-4deb-91c6-088d13208661-util\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:57.141955 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.141925 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nggcp\" (UniqueName: \"kubernetes.io/projected/5c065f30-eb67-42aa-a23e-92556f94f4ad-kube-api-access-nggcp\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:57.141955 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.141934 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c065f30-eb67-42aa-a23e-92556f94f4ad-util\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:57.141955 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.141942 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf8ab3af-8dce-4deb-91c6-088d13208661-bundle\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:33:57.814140 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:33:57.814077 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" event={"ID":"72478216-4930-46d9-bde5-0cc8c15d42fd","Type":"ContainerDied","Data":"4cd64309b5b3c24295e131c2d1885c39a8102ac745f813fd2573ef6b50420cbd"} Apr 22 19:33:57.814140 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.814141 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cd64309b5b3c24295e131c2d1885c39a8102ac745f813fd2573ef6b50420cbd" Apr 22 19:33:57.814625 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.814157 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c302cj7x" Apr 22 19:33:57.815846 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.815822 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" Apr 22 19:33:57.815979 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.815822 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfq847" event={"ID":"5c065f30-eb67-42aa-a23e-92556f94f4ad","Type":"ContainerDied","Data":"27f42e7ffbb0b8961327f0d1ca6c303458fcc0dd6fd269903d3d26e97ef291b6"} Apr 22 19:33:57.815979 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.815936 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f42e7ffbb0b8961327f0d1ca6c303458fcc0dd6fd269903d3d26e97ef291b6" Apr 22 19:33:57.817639 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.817612 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" event={"ID":"bf8ab3af-8dce-4deb-91c6-088d13208661","Type":"ContainerDied","Data":"9b418eeb24f6e4a64bf9b8abbc77acf6a84e0c255b8a3a4f34c23d95d7d763a0"} Apr 22 19:33:57.817639 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.817639 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b418eeb24f6e4a64bf9b8abbc77acf6a84e0c255b8a3a4f34c23d95d7d763a0" Apr 22 19:33:57.817791 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:33:57.817645 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e50372jjj" Apr 22 19:34:02.966399 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966361 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-c62xk"] Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966639 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72478216-4930-46d9-bde5-0cc8c15d42fd" containerName="extract" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966650 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="72478216-4930-46d9-bde5-0cc8c15d42fd" containerName="extract" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966664 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c065f30-eb67-42aa-a23e-92556f94f4ad" containerName="extract" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966671 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c065f30-eb67-42aa-a23e-92556f94f4ad" containerName="extract" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966679 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf8ab3af-8dce-4deb-91c6-088d13208661" containerName="pull" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966684 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8ab3af-8dce-4deb-91c6-088d13208661" containerName="pull" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966691 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf8ab3af-8dce-4deb-91c6-088d13208661" containerName="extract" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966697 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8ab3af-8dce-4deb-91c6-088d13208661" containerName="extract" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966703 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72478216-4930-46d9-bde5-0cc8c15d42fd" containerName="util" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966709 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="72478216-4930-46d9-bde5-0cc8c15d42fd" containerName="util" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966714 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c065f30-eb67-42aa-a23e-92556f94f4ad" containerName="pull" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966720 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c065f30-eb67-42aa-a23e-92556f94f4ad" containerName="pull" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966727 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="530defbb-69c4-436d-9fbf-c85bea38c64e" containerName="extract" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966731 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="530defbb-69c4-436d-9fbf-c85bea38c64e" containerName="extract" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966740 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf8ab3af-8dce-4deb-91c6-088d13208661" 
containerName="util" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966745 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8ab3af-8dce-4deb-91c6-088d13208661" containerName="util" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966750 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72478216-4930-46d9-bde5-0cc8c15d42fd" containerName="pull" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966755 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="72478216-4930-46d9-bde5-0cc8c15d42fd" containerName="pull" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966761 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="530defbb-69c4-436d-9fbf-c85bea38c64e" containerName="util" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966765 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="530defbb-69c4-436d-9fbf-c85bea38c64e" containerName="util" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966771 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c065f30-eb67-42aa-a23e-92556f94f4ad" containerName="util" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966775 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c065f30-eb67-42aa-a23e-92556f94f4ad" containerName="util" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966781 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="530defbb-69c4-436d-9fbf-c85bea38c64e" containerName="pull" Apr 22 19:34:02.966794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966786 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="530defbb-69c4-436d-9fbf-c85bea38c64e" containerName="pull" Apr 22 19:34:02.967481 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966828 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf8ab3af-8dce-4deb-91c6-088d13208661" containerName="extract" Apr 22 19:34:02.967481 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966836 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c065f30-eb67-42aa-a23e-92556f94f4ad" containerName="extract" Apr 22 19:34:02.967481 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966843 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="530defbb-69c4-436d-9fbf-c85bea38c64e" containerName="extract" Apr 22 19:34:02.967481 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.966851 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="72478216-4930-46d9-bde5-0cc8c15d42fd" containerName="extract" Apr 22 19:34:02.969758 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.969739 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-c62xk" Apr 22 19:34:02.972727 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.972704 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 19:34:02.972964 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.972944 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 19:34:02.974017 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.973998 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-9lwmn\"" Apr 22 19:34:02.983614 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.983585 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-c62xk"] Apr 22 19:34:02.989390 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:02.989363 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8s24\" (UniqueName: \"kubernetes.io/projected/41c4f928-0361-4948-8ed4-a14000b7e054-kube-api-access-m8s24\") pod \"limitador-operator-controller-manager-c7fb4c8d5-c62xk\" (UID: \"41c4f928-0361-4948-8ed4-a14000b7e054\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-c62xk" Apr 22 19:34:03.089997 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:03.089955 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8s24\" (UniqueName: \"kubernetes.io/projected/41c4f928-0361-4948-8ed4-a14000b7e054-kube-api-access-m8s24\") pod \"limitador-operator-controller-manager-c7fb4c8d5-c62xk\" (UID: \"41c4f928-0361-4948-8ed4-a14000b7e054\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-c62xk" Apr 22 19:34:03.100436 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:03.100399 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8s24\" (UniqueName: \"kubernetes.io/projected/41c4f928-0361-4948-8ed4-a14000b7e054-kube-api-access-m8s24\") pod \"limitador-operator-controller-manager-c7fb4c8d5-c62xk\" (UID: \"41c4f928-0361-4948-8ed4-a14000b7e054\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-c62xk" Apr 22 19:34:03.279831 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:03.279733 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-c62xk" Apr 22 19:34:03.451353 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:03.451316 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-c62xk"] Apr 22 19:34:03.454528 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:34:03.454499 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41c4f928_0361_4948_8ed4_a14000b7e054.slice/crio-d60d7623da417f2d270d161b9025e92a462c03f32cee26366022bd87ead0258a WatchSource:0}: Error finding container d60d7623da417f2d270d161b9025e92a462c03f32cee26366022bd87ead0258a: Status 404 returned error can't find the container with id d60d7623da417f2d270d161b9025e92a462c03f32cee26366022bd87ead0258a Apr 22 19:34:03.839802 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:03.839770 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-c62xk" event={"ID":"41c4f928-0361-4948-8ed4-a14000b7e054","Type":"ContainerStarted","Data":"d60d7623da417f2d270d161b9025e92a462c03f32cee26366022bd87ead0258a"} Apr 22 19:34:06.852724 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:06.852675 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-c62xk" event={"ID":"41c4f928-0361-4948-8ed4-a14000b7e054","Type":"ContainerStarted","Data":"4bd6c495a6a0a35f5a8e2fe26ebe9efe0d972e478c225534eb00b652f419b38f"} Apr 22 19:34:06.853259 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:06.852742 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-c62xk" Apr 22 19:34:06.873790 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:06.873731 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-c62xk" podStartSLOduration=1.980703687 podStartE2EDuration="4.873716724s" podCreationTimestamp="2026-04-22 19:34:02 +0000 UTC" firstStartedPulling="2026-04-22 19:34:03.456428513 +0000 UTC m=+616.104443295" lastFinishedPulling="2026-04-22 19:34:06.349441549 +0000 UTC m=+618.997456332" observedRunningTime="2026-04-22 19:34:06.872934275 +0000 UTC m=+619.520949080" watchObservedRunningTime="2026-04-22 19:34:06.873716724 +0000 UTC m=+619.521731528" Apr 22 19:34:08.932924 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:08.932883 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-txgjc"] Apr 22 19:34:08.965954 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:08.965915 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-txgjc"] Apr 22 19:34:08.966154 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:08.966046 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-txgjc" Apr 22 19:34:08.968659 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:08.968635 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-bm5lp\"" Apr 22 19:34:09.036789 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:09.036756 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzgmc\" (UniqueName: \"kubernetes.io/projected/5d7e87a7-8512-4326-8ce5-2e37c4dc83bf-kube-api-access-rzgmc\") pod \"authorino-operator-7587b89b76-txgjc\" (UID: \"5d7e87a7-8512-4326-8ce5-2e37c4dc83bf\") " pod="kuadrant-system/authorino-operator-7587b89b76-txgjc" Apr 22 19:34:09.138012 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:09.137967 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzgmc\" (UniqueName: \"kubernetes.io/projected/5d7e87a7-8512-4326-8ce5-2e37c4dc83bf-kube-api-access-rzgmc\") pod \"authorino-operator-7587b89b76-txgjc\" (UID: \"5d7e87a7-8512-4326-8ce5-2e37c4dc83bf\") " pod="kuadrant-system/authorino-operator-7587b89b76-txgjc" Apr 22 19:34:09.149436 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:09.149399 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzgmc\" (UniqueName: \"kubernetes.io/projected/5d7e87a7-8512-4326-8ce5-2e37c4dc83bf-kube-api-access-rzgmc\") pod \"authorino-operator-7587b89b76-txgjc\" (UID: \"5d7e87a7-8512-4326-8ce5-2e37c4dc83bf\") " pod="kuadrant-system/authorino-operator-7587b89b76-txgjc" Apr 22 19:34:09.275863 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:09.275837 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-txgjc" Apr 22 19:34:09.419639 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:09.419608 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-txgjc"] Apr 22 19:34:09.421745 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:34:09.421709 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d7e87a7_8512_4326_8ce5_2e37c4dc83bf.slice/crio-43d9b1a015d1392ca078b286eb7bd066a83e64ddd0a33bc4f78e165ab18db299 WatchSource:0}: Error finding container 43d9b1a015d1392ca078b286eb7bd066a83e64ddd0a33bc4f78e165ab18db299: Status 404 returned error can't find the container with id 43d9b1a015d1392ca078b286eb7bd066a83e64ddd0a33bc4f78e165ab18db299 Apr 22 19:34:09.866822 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:09.866780 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-txgjc" event={"ID":"5d7e87a7-8512-4326-8ce5-2e37c4dc83bf","Type":"ContainerStarted","Data":"43d9b1a015d1392ca078b286eb7bd066a83e64ddd0a33bc4f78e165ab18db299"} Apr 22 19:34:11.875401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:11.875363 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-txgjc" event={"ID":"5d7e87a7-8512-4326-8ce5-2e37c4dc83bf","Type":"ContainerStarted","Data":"4e28e7333f5e5eb01eb54e5f22590194d1045a1d353e64fea6ef656e04286980"} Apr 22 19:34:11.875799 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:11.875454 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-txgjc" Apr 22 19:34:11.898887 
ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:11.898831 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-txgjc" podStartSLOduration=2.057471748 podStartE2EDuration="3.898813547s" podCreationTimestamp="2026-04-22 19:34:08 +0000 UTC" firstStartedPulling="2026-04-22 19:34:09.424785396 +0000 UTC m=+622.072800187" lastFinishedPulling="2026-04-22 19:34:11.266127201 +0000 UTC m=+623.914141986" observedRunningTime="2026-04-22 19:34:11.897437962 +0000 UTC m=+624.545452767" watchObservedRunningTime="2026-04-22 19:34:11.898813547 +0000 UTC m=+624.546828350" Apr 22 19:34:17.858445 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:17.858410 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-c62xk" Apr 22 19:34:22.882540 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:22.882500 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-txgjc" Apr 22 19:34:56.610769 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:56.610729 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-tgt6j"] Apr 22 19:34:56.613528 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:56.613513 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-tgt6j" Apr 22 19:34:56.616007 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:56.615983 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 19:34:56.616154 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:56.616133 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-gl48n\"" Apr 22 19:34:56.623902 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:56.623879 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-tgt6j"] Apr 22 19:34:56.650200 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:56.650152 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-tgt6j"] Apr 22 19:34:56.733085 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:56.733049 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2be03193-fbc2-4c07-b0e5-a853c833c1fc-config-file\") pod \"limitador-limitador-67566c68b4-tgt6j\" (UID: \"2be03193-fbc2-4c07-b0e5-a853c833c1fc\") " pod="kuadrant-system/limitador-limitador-67566c68b4-tgt6j" Apr 22 19:34:56.733284 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:56.733168 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkvjb\" (UniqueName: \"kubernetes.io/projected/2be03193-fbc2-4c07-b0e5-a853c833c1fc-kube-api-access-bkvjb\") pod \"limitador-limitador-67566c68b4-tgt6j\" (UID: \"2be03193-fbc2-4c07-b0e5-a853c833c1fc\") " pod="kuadrant-system/limitador-limitador-67566c68b4-tgt6j" Apr 22 19:34:56.833688 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:56.833647 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkvjb\" (UniqueName: \"kubernetes.io/projected/2be03193-fbc2-4c07-b0e5-a853c833c1fc-kube-api-access-bkvjb\") pod \"limitador-limitador-67566c68b4-tgt6j\" 
(UID: \"2be03193-fbc2-4c07-b0e5-a853c833c1fc\") " pod="kuadrant-system/limitador-limitador-67566c68b4-tgt6j" Apr 22 19:34:56.833881 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:56.833701 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2be03193-fbc2-4c07-b0e5-a853c833c1fc-config-file\") pod \"limitador-limitador-67566c68b4-tgt6j\" (UID: \"2be03193-fbc2-4c07-b0e5-a853c833c1fc\") " pod="kuadrant-system/limitador-limitador-67566c68b4-tgt6j" Apr 22 19:34:56.834310 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:56.834292 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2be03193-fbc2-4c07-b0e5-a853c833c1fc-config-file\") pod \"limitador-limitador-67566c68b4-tgt6j\" (UID: \"2be03193-fbc2-4c07-b0e5-a853c833c1fc\") " pod="kuadrant-system/limitador-limitador-67566c68b4-tgt6j" Apr 22 19:34:56.842563 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:56.842540 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkvjb\" (UniqueName: \"kubernetes.io/projected/2be03193-fbc2-4c07-b0e5-a853c833c1fc-kube-api-access-bkvjb\") pod \"limitador-limitador-67566c68b4-tgt6j\" (UID: \"2be03193-fbc2-4c07-b0e5-a853c833c1fc\") " pod="kuadrant-system/limitador-limitador-67566c68b4-tgt6j" Apr 22 19:34:56.924501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:56.924409 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-tgt6j" Apr 22 19:34:57.054425 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:57.054391 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-tgt6j"] Apr 22 19:34:57.057399 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:34:57.057360 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2be03193_fbc2_4c07_b0e5_a853c833c1fc.slice/crio-56757530c958bc5e3af3630df18b8ef9fbf56fefa7b7586c41bc014ea273f3cc WatchSource:0}: Error finding container 56757530c958bc5e3af3630df18b8ef9fbf56fefa7b7586c41bc014ea273f3cc: Status 404 returned error can't find the container with id 56757530c958bc5e3af3630df18b8ef9fbf56fefa7b7586c41bc014ea273f3cc Apr 22 19:34:58.038650 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:34:58.038612 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-tgt6j" event={"ID":"2be03193-fbc2-4c07-b0e5-a853c833c1fc","Type":"ContainerStarted","Data":"56757530c958bc5e3af3630df18b8ef9fbf56fefa7b7586c41bc014ea273f3cc"} Apr 22 19:35:01.052476 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:01.052426 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-tgt6j" event={"ID":"2be03193-fbc2-4c07-b0e5-a853c833c1fc","Type":"ContainerStarted","Data":"b0851810474219b760a3c6993f1b24d7750b1dfddb38e059535cb404f38101f6"} Apr 22 19:35:01.052476 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:01.052476 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-tgt6j" Apr 22 19:35:01.073209 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:01.073150 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-tgt6j" podStartSLOduration=1.258324662 podStartE2EDuration="5.07313498s" 
podCreationTimestamp="2026-04-22 19:34:56 +0000 UTC" firstStartedPulling="2026-04-22 19:34:57.059141153 +0000 UTC m=+669.707155934" lastFinishedPulling="2026-04-22 19:35:00.87395147 +0000 UTC m=+673.521966252" observedRunningTime="2026-04-22 19:35:01.071340334 +0000 UTC m=+673.719355167" watchObservedRunningTime="2026-04-22 19:35:01.07313498 +0000 UTC m=+673.721149780" Apr 22 19:35:12.057394 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:12.057362 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-tgt6j" Apr 22 19:35:35.573623 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.573513 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg"] Apr 22 19:35:35.574172 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.573826 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" podUID="a51228d5-4229-4181-a691-5eb6ead8841f" containerName="discovery" containerID="cri-o://50ecaab714dd8938e36706dd606ebc45f7acb0b4a926d011a441a18a81244fb6" gracePeriod=30 Apr 22 19:35:35.830801 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.830732 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:35:35.954206 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.954172 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-istio-kubeconfig\") pod \"a51228d5-4229-4181-a691-5eb6ead8841f\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " Apr 22 19:35:35.954394 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.954266 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a51228d5-4229-4181-a691-5eb6ead8841f-local-certs\") pod \"a51228d5-4229-4181-a691-5eb6ead8841f\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " Apr 22 19:35:35.954394 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.954321 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvmck\" (UniqueName: \"kubernetes.io/projected/a51228d5-4229-4181-a691-5eb6ead8841f-kube-api-access-bvmck\") pod \"a51228d5-4229-4181-a691-5eb6ead8841f\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " Apr 22 19:35:35.954394 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.954360 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-istio-csr-dns-cert\") pod \"a51228d5-4229-4181-a691-5eb6ead8841f\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " Apr 22 19:35:35.954551 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.954404 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-cacerts\") pod \"a51228d5-4229-4181-a691-5eb6ead8841f\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " Apr 22 19:35:35.954551 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.954427 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: 
\"kubernetes.io/configmap/a51228d5-4229-4181-a691-5eb6ead8841f-istio-csr-ca-configmap\") pod \"a51228d5-4229-4181-a691-5eb6ead8841f\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " Apr 22 19:35:35.954551 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.954464 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a51228d5-4229-4181-a691-5eb6ead8841f-istio-token\") pod \"a51228d5-4229-4181-a691-5eb6ead8841f\" (UID: \"a51228d5-4229-4181-a691-5eb6ead8841f\") " Apr 22 19:35:35.954978 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.954942 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a51228d5-4229-4181-a691-5eb6ead8841f-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "a51228d5-4229-4181-a691-5eb6ead8841f" (UID: "a51228d5-4229-4181-a691-5eb6ead8841f"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:35:35.957155 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.957122 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "a51228d5-4229-4181-a691-5eb6ead8841f" (UID: "a51228d5-4229-4181-a691-5eb6ead8841f"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:35:35.957284 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.957169 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "a51228d5-4229-4181-a691-5eb6ead8841f" (UID: "a51228d5-4229-4181-a691-5eb6ead8841f"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:35:35.957448 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.957415 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a51228d5-4229-4181-a691-5eb6ead8841f-kube-api-access-bvmck" (OuterVolumeSpecName: "kube-api-access-bvmck") pod "a51228d5-4229-4181-a691-5eb6ead8841f" (UID: "a51228d5-4229-4181-a691-5eb6ead8841f"). InnerVolumeSpecName "kube-api-access-bvmck". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:35:35.957448 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.957428 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a51228d5-4229-4181-a691-5eb6ead8841f-local-certs" (OuterVolumeSpecName: "local-certs") pod "a51228d5-4229-4181-a691-5eb6ead8841f" (UID: "a51228d5-4229-4181-a691-5eb6ead8841f"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:35:35.957660 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.957544 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-cacerts" (OuterVolumeSpecName: "cacerts") pod "a51228d5-4229-4181-a691-5eb6ead8841f" (UID: "a51228d5-4229-4181-a691-5eb6ead8841f"). InnerVolumeSpecName "cacerts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:35:35.957660 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:35.957559 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a51228d5-4229-4181-a691-5eb6ead8841f-istio-token" (OuterVolumeSpecName: "istio-token") pod "a51228d5-4229-4181-a691-5eb6ead8841f" (UID: "a51228d5-4229-4181-a691-5eb6ead8841f"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:35:36.055520 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.055478 2569 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a51228d5-4229-4181-a691-5eb6ead8841f-local-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:35:36.055520 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.055513 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bvmck\" (UniqueName: \"kubernetes.io/projected/a51228d5-4229-4181-a691-5eb6ead8841f-kube-api-access-bvmck\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:35:36.055520 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.055524 2569 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-istio-csr-dns-cert\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:35:36.055755 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.055533 2569 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-cacerts\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:35:36.055755 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.055543 2569 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a51228d5-4229-4181-a691-5eb6ead8841f-istio-csr-ca-configmap\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:35:36.055755 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.055552 2569 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a51228d5-4229-4181-a691-5eb6ead8841f-istio-token\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:35:36.055755 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.055561 2569 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a51228d5-4229-4181-a691-5eb6ead8841f-istio-kubeconfig\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:35:36.179460 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.179367 2569 generic.go:358] "Generic (PLEG): container finished" podID="a51228d5-4229-4181-a691-5eb6ead8841f" containerID="50ecaab714dd8938e36706dd606ebc45f7acb0b4a926d011a441a18a81244fb6" exitCode=0 Apr 22 19:35:36.179460 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.179452 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" Apr 22 19:35:36.179650 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.179459 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" event={"ID":"a51228d5-4229-4181-a691-5eb6ead8841f","Type":"ContainerDied","Data":"50ecaab714dd8938e36706dd606ebc45f7acb0b4a926d011a441a18a81244fb6"} Apr 22 19:35:36.179650 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.179495 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg" event={"ID":"a51228d5-4229-4181-a691-5eb6ead8841f","Type":"ContainerDied","Data":"5482df703d2d3b7529e77ffd85f24589c095f29bc957247d88d81b2347483869"} Apr 22 19:35:36.179650 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.179510 2569 scope.go:117] "RemoveContainer" containerID="50ecaab714dd8938e36706dd606ebc45f7acb0b4a926d011a441a18a81244fb6" Apr 22 19:35:36.188640 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.188622 2569 scope.go:117] "RemoveContainer" containerID="50ecaab714dd8938e36706dd606ebc45f7acb0b4a926d011a441a18a81244fb6" Apr 22 19:35:36.188943 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:35:36.188910 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ecaab714dd8938e36706dd606ebc45f7acb0b4a926d011a441a18a81244fb6\": container with ID starting with 50ecaab714dd8938e36706dd606ebc45f7acb0b4a926d011a441a18a81244fb6 not found: ID does not exist" containerID="50ecaab714dd8938e36706dd606ebc45f7acb0b4a926d011a441a18a81244fb6" Apr 22 19:35:36.188994 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.188952 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ecaab714dd8938e36706dd606ebc45f7acb0b4a926d011a441a18a81244fb6"} err="failed to get container status \"50ecaab714dd8938e36706dd606ebc45f7acb0b4a926d011a441a18a81244fb6\": rpc error: code = NotFound desc = could not find container \"50ecaab714dd8938e36706dd606ebc45f7acb0b4a926d011a441a18a81244fb6\": container with ID starting with 50ecaab714dd8938e36706dd606ebc45f7acb0b4a926d011a441a18a81244fb6 not found: ID does not exist" Apr 22 19:35:36.204214 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.204184 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg"] Apr 22 19:35:36.215090 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:36.215056 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-s5lhg"] Apr 22 19:35:37.952643 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:37.952613 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a51228d5-4229-4181-a691-5eb6ead8841f" path="/var/lib/kubelet/pods/a51228d5-4229-4181-a691-5eb6ead8841f/volumes" Apr 22 19:35:40.245804 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.245753 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-k7krt"] Apr 22 19:35:40.246295 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.246213 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a51228d5-4229-4181-a691-5eb6ead8841f" containerName="discovery" Apr 22 19:35:40.246295 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.246240 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a51228d5-4229-4181-a691-5eb6ead8841f" containerName="discovery" Apr 22 19:35:40.246441 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.246314 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a51228d5-4229-4181-a691-5eb6ead8841f" containerName="discovery" Apr 22 19:35:40.249345 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.249324 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-k7krt" Apr 22 19:35:40.251961 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.251939 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 19:35:40.252067 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.251950 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-hd2fj\"" Apr 22 19:35:40.252067 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.251967 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 19:35:40.253272 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.253252 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 19:35:40.267312 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.267283 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-k7krt"] Apr 22 19:35:40.269067 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.269043 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-768c94fb69-x9cvs"] Apr 22 19:35:40.272294 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.272270 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-768c94fb69-x9cvs" Apr 22 19:35:40.275050 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.275025 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 19:35:40.275590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.275573 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-x4qjh\"" Apr 22 19:35:40.281473 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.281444 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-768c94fb69-x9cvs"] Apr 22 19:35:40.304110 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.304064 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-q8hsv"] Apr 22 19:35:40.307071 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.307054 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-q8hsv" Apr 22 19:35:40.309830 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.309807 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 19:35:40.310029 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.310005 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-9cm68\"" Apr 22 19:35:40.321846 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.321817 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-q8hsv"] Apr 22 19:35:40.393275 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.393233 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgj9c\" (UniqueName: \"kubernetes.io/projected/c42cce3c-2847-4703-aa07-1c53dbf3a75f-kube-api-access-jgj9c\") pod \"llmisvc-controller-manager-768c94fb69-x9cvs\" (UID: \"c42cce3c-2847-4703-aa07-1c53dbf3a75f\") " pod="kserve/llmisvc-controller-manager-768c94fb69-x9cvs" Apr 22 19:35:40.393275 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.393275 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhp8r\" (UniqueName: \"kubernetes.io/projected/c777dda1-33cf-445a-91a2-15b066fd5d2e-kube-api-access-nhp8r\") pod \"seaweedfs-86cc847c5c-q8hsv\" (UID: \"c777dda1-33cf-445a-91a2-15b066fd5d2e\") " pod="kserve/seaweedfs-86cc847c5c-q8hsv" Apr 22 19:35:40.393493 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.393311 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c42cce3c-2847-4703-aa07-1c53dbf3a75f-cert\") pod \"llmisvc-controller-manager-768c94fb69-x9cvs\" (UID: \"c42cce3c-2847-4703-aa07-1c53dbf3a75f\") " pod="kserve/llmisvc-controller-manager-768c94fb69-x9cvs" Apr 22 19:35:40.393493 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.393350 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c777dda1-33cf-445a-91a2-15b066fd5d2e-data\") pod \"seaweedfs-86cc847c5c-q8hsv\" (UID: \"c777dda1-33cf-445a-91a2-15b066fd5d2e\") " pod="kserve/seaweedfs-86cc847c5c-q8hsv" Apr 22 19:35:40.393493 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.393434 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-cert\") pod \"kserve-controller-manager-545d8995fb-k7krt\" (UID: \"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430\") " pod="kserve/kserve-controller-manager-545d8995fb-k7krt" Apr 22 19:35:40.393493 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.393462 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wx7x\" (UniqueName: \"kubernetes.io/projected/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-kube-api-access-9wx7x\") pod \"kserve-controller-manager-545d8995fb-k7krt\" (UID: \"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430\") " pod="kserve/kserve-controller-manager-545d8995fb-k7krt" Apr 22 19:35:40.494259 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.494211 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c777dda1-33cf-445a-91a2-15b066fd5d2e-data\") pod 
\"seaweedfs-86cc847c5c-q8hsv\" (UID: \"c777dda1-33cf-445a-91a2-15b066fd5d2e\") " pod="kserve/seaweedfs-86cc847c5c-q8hsv" Apr 22 19:35:40.494437 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.494299 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-cert\") pod \"kserve-controller-manager-545d8995fb-k7krt\" (UID: \"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430\") " pod="kserve/kserve-controller-manager-545d8995fb-k7krt" Apr 22 19:35:40.494437 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.494329 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wx7x\" (UniqueName: \"kubernetes.io/projected/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-kube-api-access-9wx7x\") pod \"kserve-controller-manager-545d8995fb-k7krt\" (UID: \"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430\") " pod="kserve/kserve-controller-manager-545d8995fb-k7krt" Apr 22 19:35:40.494437 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.494353 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgj9c\" (UniqueName: \"kubernetes.io/projected/c42cce3c-2847-4703-aa07-1c53dbf3a75f-kube-api-access-jgj9c\") pod \"llmisvc-controller-manager-768c94fb69-x9cvs\" (UID: \"c42cce3c-2847-4703-aa07-1c53dbf3a75f\") " pod="kserve/llmisvc-controller-manager-768c94fb69-x9cvs" Apr 22 19:35:40.494597 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:35:40.494477 2569 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 22 19:35:40.494597 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:35:40.494551 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-cert podName:1f7dd602-d0d9-42a0-8b43-bc1faf7e9430 nodeName:}" failed. No retries permitted until 2026-04-22 19:35:40.994530053 +0000 UTC m=+713.642544858 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-cert") pod "kserve-controller-manager-545d8995fb-k7krt" (UID: "1f7dd602-d0d9-42a0-8b43-bc1faf7e9430") : secret "kserve-webhook-server-cert" not found Apr 22 19:35:40.494597 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.494490 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhp8r\" (UniqueName: \"kubernetes.io/projected/c777dda1-33cf-445a-91a2-15b066fd5d2e-kube-api-access-nhp8r\") pod \"seaweedfs-86cc847c5c-q8hsv\" (UID: \"c777dda1-33cf-445a-91a2-15b066fd5d2e\") " pod="kserve/seaweedfs-86cc847c5c-q8hsv" Apr 22 19:35:40.494741 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.494645 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c42cce3c-2847-4703-aa07-1c53dbf3a75f-cert\") pod \"llmisvc-controller-manager-768c94fb69-x9cvs\" (UID: \"c42cce3c-2847-4703-aa07-1c53dbf3a75f\") " pod="kserve/llmisvc-controller-manager-768c94fb69-x9cvs" Apr 22 19:35:40.494741 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.494647 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c777dda1-33cf-445a-91a2-15b066fd5d2e-data\") pod \"seaweedfs-86cc847c5c-q8hsv\" (UID: \"c777dda1-33cf-445a-91a2-15b066fd5d2e\") " pod="kserve/seaweedfs-86cc847c5c-q8hsv" Apr 22 19:35:40.497255 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.497199 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c42cce3c-2847-4703-aa07-1c53dbf3a75f-cert\") pod \"llmisvc-controller-manager-768c94fb69-x9cvs\" (UID: \"c42cce3c-2847-4703-aa07-1c53dbf3a75f\") " pod="kserve/llmisvc-controller-manager-768c94fb69-x9cvs" Apr 22 19:35:40.507945 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.507912 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhp8r\" (UniqueName: \"kubernetes.io/projected/c777dda1-33cf-445a-91a2-15b066fd5d2e-kube-api-access-nhp8r\") pod \"seaweedfs-86cc847c5c-q8hsv\" (UID: \"c777dda1-33cf-445a-91a2-15b066fd5d2e\") " pod="kserve/seaweedfs-86cc847c5c-q8hsv" Apr 22 19:35:40.508186 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.508166 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wx7x\" (UniqueName: \"kubernetes.io/projected/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-kube-api-access-9wx7x\") pod \"kserve-controller-manager-545d8995fb-k7krt\" (UID: \"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430\") " pod="kserve/kserve-controller-manager-545d8995fb-k7krt" Apr 22 19:35:40.508331 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.508315 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgj9c\" (UniqueName: \"kubernetes.io/projected/c42cce3c-2847-4703-aa07-1c53dbf3a75f-kube-api-access-jgj9c\") pod \"llmisvc-controller-manager-768c94fb69-x9cvs\" (UID: \"c42cce3c-2847-4703-aa07-1c53dbf3a75f\") " pod="kserve/llmisvc-controller-manager-768c94fb69-x9cvs" Apr 22 19:35:40.583524 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.583494 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-768c94fb69-x9cvs" Apr 22 19:35:40.616554 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.616521 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-q8hsv" Apr 22 19:35:40.727573 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.727543 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-768c94fb69-x9cvs"] Apr 22 19:35:40.728022 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:35:40.727993 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc42cce3c_2847_4703_aa07_1c53dbf3a75f.slice/crio-c733fbf7992b035e911bbbcbac95fd02e4fd824c46ac5efdd5b16835d7f62db4 WatchSource:0}: Error finding container c733fbf7992b035e911bbbcbac95fd02e4fd824c46ac5efdd5b16835d7f62db4: Status 404 returned error can't find the container with id c733fbf7992b035e911bbbcbac95fd02e4fd824c46ac5efdd5b16835d7f62db4 Apr 22 19:35:40.764500 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:40.764474 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-q8hsv"] Apr 22 19:35:40.766371 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:35:40.766339 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc777dda1_33cf_445a_91a2_15b066fd5d2e.slice/crio-f5e2e743bb25aa0336488cae777263b4badc94e661ca66ea363cae29435d3415 WatchSource:0}: Error finding container f5e2e743bb25aa0336488cae777263b4badc94e661ca66ea363cae29435d3415: Status 404 returned error can't find the container with id f5e2e743bb25aa0336488cae777263b4badc94e661ca66ea363cae29435d3415 Apr 22 19:35:41.000155 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:41.000116 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-cert\") pod \"kserve-controller-manager-545d8995fb-k7krt\" (UID: \"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430\") " pod="kserve/kserve-controller-manager-545d8995fb-k7krt" Apr 22 19:35:41.002673 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:41.002627 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-cert\") pod \"kserve-controller-manager-545d8995fb-k7krt\" (UID: \"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430\") " pod="kserve/kserve-controller-manager-545d8995fb-k7krt" Apr 22 19:35:41.161149 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:41.161054 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-k7krt" Apr 22 19:35:41.205236 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:41.204908 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-768c94fb69-x9cvs" event={"ID":"c42cce3c-2847-4703-aa07-1c53dbf3a75f","Type":"ContainerStarted","Data":"c733fbf7992b035e911bbbcbac95fd02e4fd824c46ac5efdd5b16835d7f62db4"} Apr 22 19:35:41.206198 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:41.206165 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-q8hsv" event={"ID":"c777dda1-33cf-445a-91a2-15b066fd5d2e","Type":"ContainerStarted","Data":"f5e2e743bb25aa0336488cae777263b4badc94e661ca66ea363cae29435d3415"} Apr 22 19:35:41.323757 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:41.323726 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-k7krt"] Apr 22 19:35:41.338548 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:35:41.338505 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7dd602_d0d9_42a0_8b43_bc1faf7e9430.slice/crio-a2df2f27cbd38738bb26fd73cc81dd1efb81d38704006a7a4068ad39abee51d1 WatchSource:0}: Error finding container a2df2f27cbd38738bb26fd73cc81dd1efb81d38704006a7a4068ad39abee51d1: Status 404 returned error can't find the container with id a2df2f27cbd38738bb26fd73cc81dd1efb81d38704006a7a4068ad39abee51d1 Apr 22 19:35:42.212608 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:42.212569 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-k7krt" event={"ID":"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430","Type":"ContainerStarted","Data":"a2df2f27cbd38738bb26fd73cc81dd1efb81d38704006a7a4068ad39abee51d1"} Apr 22 19:35:46.232662 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:46.232618 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-q8hsv" event={"ID":"c777dda1-33cf-445a-91a2-15b066fd5d2e","Type":"ContainerStarted","Data":"9c118b3935638e32cf306b642a5ea040cd235a20fa5d76884fb773d7ca288a5c"} Apr 22 19:35:46.233141 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:46.232669 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-q8hsv" Apr 22 19:35:46.234155 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:46.234123 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-k7krt" event={"ID":"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430","Type":"ContainerStarted","Data":"6ff431ca05e2946808cd057862d7748ffb43402526f3c5e5a935bb74cd57ec37"} Apr 22 19:35:46.234291 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:46.234255 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-545d8995fb-k7krt" Apr 22 19:35:46.235493 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:46.235471 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-768c94fb69-x9cvs" event={"ID":"c42cce3c-2847-4703-aa07-1c53dbf3a75f","Type":"ContainerStarted","Data":"3c2cdc588fa31c4a63a1c0bfaa5a567a91176ae13e89f5e9c0f0109dc41df9a7"} Apr 22 19:35:46.235599 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:46.235572 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-768c94fb69-x9cvs" Apr 22 19:35:46.253328 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:35:46.253277 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-q8hsv" podStartSLOduration=1.3424698400000001 podStartE2EDuration="6.253259948s" podCreationTimestamp="2026-04-22 19:35:40 +0000 UTC" firstStartedPulling="2026-04-22 19:35:40.767725882 +0000 UTC m=+713.415740665" lastFinishedPulling="2026-04-22 19:35:45.678515982 +0000 UTC m=+718.326530773" observedRunningTime="2026-04-22 19:35:46.250753473 +0000 UTC m=+718.898768278" watchObservedRunningTime="2026-04-22 19:35:46.253259948 +0000 UTC m=+718.901274755" Apr 22 19:35:46.268279 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:46.268213 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-545d8995fb-k7krt" podStartSLOduration=1.973172234 podStartE2EDuration="6.268197113s" podCreationTimestamp="2026-04-22 19:35:40 +0000 UTC" firstStartedPulling="2026-04-22 19:35:41.340126621 +0000 UTC m=+713.988141414" lastFinishedPulling="2026-04-22 19:35:45.635151511 +0000 UTC m=+718.283166293" observedRunningTime="2026-04-22 19:35:46.266278504 +0000 UTC m=+718.914293311" watchObservedRunningTime="2026-04-22 19:35:46.268197113 +0000 UTC m=+718.916211916" Apr 22 19:35:46.283356 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:46.283291 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-768c94fb69-x9cvs" podStartSLOduration=1.389066005 podStartE2EDuration="6.283277161s" podCreationTimestamp="2026-04-22 19:35:40 +0000 UTC" firstStartedPulling="2026-04-22 19:35:40.729187876 +0000 UTC m=+713.377202660" lastFinishedPulling="2026-04-22 19:35:45.623399034 +0000 UTC m=+718.271413816" observedRunningTime="2026-04-22 19:35:46.281237556 +0000 UTC m=+718.929252361" watchObservedRunningTime="2026-04-22 19:35:46.283277161 +0000 UTC m=+718.931291965" Apr 22 19:35:52.242049 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:35:52.242007 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-q8hsv" Apr 22 19:36:17.241712 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:17.241676 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-768c94fb69-x9cvs" Apr 22 19:36:17.244565 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:17.244543 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-545d8995fb-k7krt" Apr 22 19:36:18.442955 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.442919 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-k7krt"] Apr 22 19:36:18.443342 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.443160 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-545d8995fb-k7krt" podUID="1f7dd602-d0d9-42a0-8b43-bc1faf7e9430" containerName="manager" containerID="cri-o://6ff431ca05e2946808cd057862d7748ffb43402526f3c5e5a935bb74cd57ec37" gracePeriod=10 Apr 22 19:36:18.467931 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.467905 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-bmn9x"] Apr 22 19:36:18.471365 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.471342 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-bmn9x" Apr 22 19:36:18.479472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.479442 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-bmn9x"] Apr 22 19:36:18.497250 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.497214 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/975d2e8d-8fee-4ec8-827c-3fae179595dc-cert\") pod \"kserve-controller-manager-545d8995fb-bmn9x\" (UID: \"975d2e8d-8fee-4ec8-827c-3fae179595dc\") " pod="kserve/kserve-controller-manager-545d8995fb-bmn9x" Apr 22 19:36:18.497432 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.497326 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t9vz\" (UniqueName: \"kubernetes.io/projected/975d2e8d-8fee-4ec8-827c-3fae179595dc-kube-api-access-9t9vz\") pod \"kserve-controller-manager-545d8995fb-bmn9x\" (UID: \"975d2e8d-8fee-4ec8-827c-3fae179595dc\") " pod="kserve/kserve-controller-manager-545d8995fb-bmn9x" Apr 22 19:36:18.598550 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.598513 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9t9vz\" (UniqueName: \"kubernetes.io/projected/975d2e8d-8fee-4ec8-827c-3fae179595dc-kube-api-access-9t9vz\") pod \"kserve-controller-manager-545d8995fb-bmn9x\" (UID: \"975d2e8d-8fee-4ec8-827c-3fae179595dc\") " pod="kserve/kserve-controller-manager-545d8995fb-bmn9x" Apr 22 19:36:18.598707 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.598570 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/975d2e8d-8fee-4ec8-827c-3fae179595dc-cert\") pod \"kserve-controller-manager-545d8995fb-bmn9x\" (UID: \"975d2e8d-8fee-4ec8-827c-3fae179595dc\") " pod="kserve/kserve-controller-manager-545d8995fb-bmn9x" Apr 22 19:36:18.601327 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.601299 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/975d2e8d-8fee-4ec8-827c-3fae179595dc-cert\") pod \"kserve-controller-manager-545d8995fb-bmn9x\" (UID: \"975d2e8d-8fee-4ec8-827c-3fae179595dc\") " pod="kserve/kserve-controller-manager-545d8995fb-bmn9x" Apr 22 19:36:18.607977 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.607952 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t9vz\" (UniqueName: \"kubernetes.io/projected/975d2e8d-8fee-4ec8-827c-3fae179595dc-kube-api-access-9t9vz\") pod \"kserve-controller-manager-545d8995fb-bmn9x\" (UID: \"975d2e8d-8fee-4ec8-827c-3fae179595dc\") " pod="kserve/kserve-controller-manager-545d8995fb-bmn9x" Apr 22 19:36:18.682275 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.682251 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-k7krt" Apr 22 19:36:18.699423 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.699283 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-cert\") pod \"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430\" (UID: \"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430\") " Apr 22 19:36:18.699423 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.699320 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wx7x\" (UniqueName: \"kubernetes.io/projected/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-kube-api-access-9wx7x\") pod \"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430\" (UID: \"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430\") " Apr 22 19:36:18.701984 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.701952 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-cert" (OuterVolumeSpecName: "cert") pod "1f7dd602-d0d9-42a0-8b43-bc1faf7e9430" (UID: "1f7dd602-d0d9-42a0-8b43-bc1faf7e9430"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:36:18.702137 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.702040 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-kube-api-access-9wx7x" (OuterVolumeSpecName: "kube-api-access-9wx7x") pod "1f7dd602-d0d9-42a0-8b43-bc1faf7e9430" (UID: "1f7dd602-d0d9-42a0-8b43-bc1faf7e9430"). InnerVolumeSpecName "kube-api-access-9wx7x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:36:18.800841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.800798 2569 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-cert\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:36:18.800841 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.800836 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9wx7x\" (UniqueName: \"kubernetes.io/projected/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430-kube-api-access-9wx7x\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:36:18.824714 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.824677 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-bmn9x" Apr 22 19:36:18.952148 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.952120 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-bmn9x"] Apr 22 19:36:18.953861 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:36:18.953826 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod975d2e8d_8fee_4ec8_827c_3fae179595dc.slice/crio-944e3a94b6a846254eb8d0e03b58bf90fb34603c77b2af1847b48b593aba25ea WatchSource:0}: Error finding container 944e3a94b6a846254eb8d0e03b58bf90fb34603c77b2af1847b48b593aba25ea: Status 404 returned error can't find the container with id 944e3a94b6a846254eb8d0e03b58bf90fb34603c77b2af1847b48b593aba25ea Apr 22 19:36:18.955106 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:18.955077 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:36:19.360947 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:19.360918 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-bmn9x" event={"ID":"975d2e8d-8fee-4ec8-827c-3fae179595dc","Type":"ContainerStarted","Data":"944e3a94b6a846254eb8d0e03b58bf90fb34603c77b2af1847b48b593aba25ea"} Apr 22 19:36:19.361978 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:19.361957 2569 generic.go:358] "Generic (PLEG): container finished" podID="1f7dd602-d0d9-42a0-8b43-bc1faf7e9430" containerID="6ff431ca05e2946808cd057862d7748ffb43402526f3c5e5a935bb74cd57ec37" exitCode=0 Apr 22 19:36:19.362045 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:19.361996 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-k7krt" event={"ID":"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430","Type":"ContainerDied","Data":"6ff431ca05e2946808cd057862d7748ffb43402526f3c5e5a935bb74cd57ec37"} Apr 22 19:36:19.362045 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:19.362014 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-k7krt" Apr 22 19:36:19.362045 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:19.362025 2569 scope.go:117] "RemoveContainer" containerID="6ff431ca05e2946808cd057862d7748ffb43402526f3c5e5a935bb74cd57ec37" Apr 22 19:36:19.362217 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:19.362015 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-k7krt" event={"ID":"1f7dd602-d0d9-42a0-8b43-bc1faf7e9430","Type":"ContainerDied","Data":"a2df2f27cbd38738bb26fd73cc81dd1efb81d38704006a7a4068ad39abee51d1"} Apr 22 19:36:19.374825 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:19.374800 2569 scope.go:117] "RemoveContainer" containerID="6ff431ca05e2946808cd057862d7748ffb43402526f3c5e5a935bb74cd57ec37" Apr 22 19:36:19.375110 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:36:19.375078 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ff431ca05e2946808cd057862d7748ffb43402526f3c5e5a935bb74cd57ec37\": container with ID starting with 6ff431ca05e2946808cd057862d7748ffb43402526f3c5e5a935bb74cd57ec37 not found: ID does not exist" containerID="6ff431ca05e2946808cd057862d7748ffb43402526f3c5e5a935bb74cd57ec37" Apr 22 19:36:19.375198 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:19.375131 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ff431ca05e2946808cd057862d7748ffb43402526f3c5e5a935bb74cd57ec37"} err="failed to get container status \"6ff431ca05e2946808cd057862d7748ffb43402526f3c5e5a935bb74cd57ec37\": rpc error: code = NotFound desc = could not find container \"6ff431ca05e2946808cd057862d7748ffb43402526f3c5e5a935bb74cd57ec37\": container with ID starting with 6ff431ca05e2946808cd057862d7748ffb43402526f3c5e5a935bb74cd57ec37 not found: ID does not exist" Apr 22 19:36:19.387361 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:19.387335 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-k7krt"] Apr 22 19:36:19.390302 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:19.390276 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-k7krt"] Apr 22 19:36:19.952009 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:19.951973 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f7dd602-d0d9-42a0-8b43-bc1faf7e9430" path="/var/lib/kubelet/pods/1f7dd602-d0d9-42a0-8b43-bc1faf7e9430/volumes" Apr 22 19:36:20.370688 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:20.370594 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-bmn9x" event={"ID":"975d2e8d-8fee-4ec8-827c-3fae179595dc","Type":"ContainerStarted","Data":"87ef60dd39913d2e1acf5f3a45e391e70825fda44059cdc0bc0aeb41582547dc"} Apr 22 19:36:20.370847 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:20.370709 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-545d8995fb-bmn9x" Apr 22 19:36:20.390837 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:20.390784 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-545d8995fb-bmn9x" podStartSLOduration=2.005360542 podStartE2EDuration="2.390767403s" podCreationTimestamp="2026-04-22 19:36:18 +0000 UTC" firstStartedPulling="2026-04-22 19:36:18.955221586 +0000 UTC m=+751.603236372" 
lastFinishedPulling="2026-04-22 19:36:19.340628451 +0000 UTC m=+751.988643233" observedRunningTime="2026-04-22 19:36:20.388488234 +0000 UTC m=+753.036503038" watchObservedRunningTime="2026-04-22 19:36:20.390767403 +0000 UTC m=+753.038782206" Apr 22 19:36:51.380763 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:51.380720 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-545d8995fb-bmn9x" Apr 22 19:36:52.190148 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.190110 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-tjsmp"] Apr 22 19:36:52.190457 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.190444 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f7dd602-d0d9-42a0-8b43-bc1faf7e9430" containerName="manager" Apr 22 19:36:52.190508 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.190459 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7dd602-d0d9-42a0-8b43-bc1faf7e9430" containerName="manager" Apr 22 19:36:52.190542 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.190522 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f7dd602-d0d9-42a0-8b43-bc1faf7e9430" containerName="manager" Apr 22 19:36:52.194554 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.194526 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tjsmp" Apr 22 19:36:52.196918 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.196893 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 19:36:52.197050 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.196901 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-fq2dx\"" Apr 22 19:36:52.202356 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.201676 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tjsmp"] Apr 22 19:36:52.204489 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.204465 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-z28l8"] Apr 22 19:36:52.207543 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.207521 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-z28l8" Apr 22 19:36:52.210193 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.210166 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 19:36:52.210312 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.210243 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-mgjgv\"" Apr 22 19:36:52.216806 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.216781 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-z28l8"] Apr 22 19:36:52.262618 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.262583 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/595b3874-3bb5-4343-885b-50df17bddd1b-tls-certs\") pod \"model-serving-api-86f7b4b499-tjsmp\" (UID: \"595b3874-3bb5-4343-885b-50df17bddd1b\") " pod="kserve/model-serving-api-86f7b4b499-tjsmp" Apr 22 19:36:52.262805 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.262622 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd-cert\") pod \"odh-model-controller-696fc77849-z28l8\" (UID: \"d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd\") " pod="kserve/odh-model-controller-696fc77849-z28l8" Apr 22 19:36:52.262805 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.262660 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46948\" (UniqueName: \"kubernetes.io/projected/595b3874-3bb5-4343-885b-50df17bddd1b-kube-api-access-46948\") pod \"model-serving-api-86f7b4b499-tjsmp\" (UID: \"595b3874-3bb5-4343-885b-50df17bddd1b\") " pod="kserve/model-serving-api-86f7b4b499-tjsmp" Apr 22 19:36:52.262805 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.262722 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbvwd\" (UniqueName: \"kubernetes.io/projected/d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd-kube-api-access-tbvwd\") pod \"odh-model-controller-696fc77849-z28l8\" (UID: \"d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd\") " pod="kserve/odh-model-controller-696fc77849-z28l8" Apr 22 19:36:52.363452 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.363418 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/595b3874-3bb5-4343-885b-50df17bddd1b-tls-certs\") pod \"model-serving-api-86f7b4b499-tjsmp\" (UID: \"595b3874-3bb5-4343-885b-50df17bddd1b\") " pod="kserve/model-serving-api-86f7b4b499-tjsmp" Apr 22 19:36:52.363642 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.363458 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd-cert\") pod \"odh-model-controller-696fc77849-z28l8\" (UID: \"d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd\") " pod="kserve/odh-model-controller-696fc77849-z28l8" Apr 22 19:36:52.363642 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:36:52.363575 2569 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 19:36:52.363642 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.363591 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46948\" (UniqueName: \"kubernetes.io/projected/595b3874-3bb5-4343-885b-50df17bddd1b-kube-api-access-46948\") pod \"model-serving-api-86f7b4b499-tjsmp\" (UID: \"595b3874-3bb5-4343-885b-50df17bddd1b\") " pod="kserve/model-serving-api-86f7b4b499-tjsmp" Apr 22 19:36:52.363801 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.363642 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbvwd\" (UniqueName: \"kubernetes.io/projected/d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd-kube-api-access-tbvwd\") pod \"odh-model-controller-696fc77849-z28l8\" (UID: \"d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd\") " pod="kserve/odh-model-controller-696fc77849-z28l8" Apr 22 19:36:52.363801 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:36:52.363665 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd-cert podName:d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd nodeName:}" failed. No retries permitted until 2026-04-22 19:36:52.863642535 +0000 UTC m=+785.511657322 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd-cert") pod "odh-model-controller-696fc77849-z28l8" (UID: "d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd") : secret "odh-model-controller-webhook-cert" not found Apr 22 19:36:52.366215 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.366177 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/595b3874-3bb5-4343-885b-50df17bddd1b-tls-certs\") pod \"model-serving-api-86f7b4b499-tjsmp\" (UID: \"595b3874-3bb5-4343-885b-50df17bddd1b\") " pod="kserve/model-serving-api-86f7b4b499-tjsmp" Apr 22 19:36:52.372675 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.372650 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbvwd\" (UniqueName: \"kubernetes.io/projected/d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd-kube-api-access-tbvwd\") pod \"odh-model-controller-696fc77849-z28l8\" (UID: \"d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd\") " pod="kserve/odh-model-controller-696fc77849-z28l8" Apr 22 19:36:52.372854 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.372827 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46948\" (UniqueName: \"kubernetes.io/projected/595b3874-3bb5-4343-885b-50df17bddd1b-kube-api-access-46948\") pod \"model-serving-api-86f7b4b499-tjsmp\" (UID: \"595b3874-3bb5-4343-885b-50df17bddd1b\") " pod="kserve/model-serving-api-86f7b4b499-tjsmp" Apr 22 19:36:52.506989 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.506940 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tjsmp" Apr 22 19:36:52.635441 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.635407 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tjsmp"] Apr 22 19:36:52.638331 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:36:52.638293 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod595b3874_3bb5_4343_885b_50df17bddd1b.slice/crio-f25ce1c31f69f6f74d93f0ec45dd2345e4c57a1946330c3b800c7c6c0857529f WatchSource:0}: Error finding container f25ce1c31f69f6f74d93f0ec45dd2345e4c57a1946330c3b800c7c6c0857529f: Status 404 returned error can't find the container with id f25ce1c31f69f6f74d93f0ec45dd2345e4c57a1946330c3b800c7c6c0857529f Apr 22 19:36:52.867673 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.867582 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd-cert\") pod \"odh-model-controller-696fc77849-z28l8\" (UID: \"d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd\") " pod="kserve/odh-model-controller-696fc77849-z28l8" Apr 22 19:36:52.870178 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:52.870152 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd-cert\") pod \"odh-model-controller-696fc77849-z28l8\" (UID: \"d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd\") " pod="kserve/odh-model-controller-696fc77849-z28l8" Apr 22 19:36:53.118857 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:53.118769 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-z28l8" Apr 22 19:36:53.278260 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:53.278229 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-z28l8"] Apr 22 19:36:53.287178 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:36:53.287136 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3e6cab3_ce19_45ec_9758_9e2c0f5d29cd.slice/crio-b50f02ec60166e8df953781850f760de8f0626686bcc012516c54f7e62df305e WatchSource:0}: Error finding container b50f02ec60166e8df953781850f760de8f0626686bcc012516c54f7e62df305e: Status 404 returned error can't find the container with id b50f02ec60166e8df953781850f760de8f0626686bcc012516c54f7e62df305e Apr 22 19:36:53.500796 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:53.500752 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tjsmp" event={"ID":"595b3874-3bb5-4343-885b-50df17bddd1b","Type":"ContainerStarted","Data":"f25ce1c31f69f6f74d93f0ec45dd2345e4c57a1946330c3b800c7c6c0857529f"} Apr 22 19:36:53.502015 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:53.501987 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-z28l8" event={"ID":"d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd","Type":"ContainerStarted","Data":"b50f02ec60166e8df953781850f760de8f0626686bcc012516c54f7e62df305e"} Apr 22 19:36:54.507847 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:54.507813 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tjsmp" 
event={"ID":"595b3874-3bb5-4343-885b-50df17bddd1b","Type":"ContainerStarted","Data":"f42999251bed0650a971a047993e14a0a4048ef11d2f960a2f0463858fd372da"} Apr 22 19:36:54.508452 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:54.507892 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-tjsmp" Apr 22 19:36:54.529263 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:54.529204 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-tjsmp" podStartSLOduration=1.380139911 podStartE2EDuration="2.529181584s" podCreationTimestamp="2026-04-22 19:36:52 +0000 UTC" firstStartedPulling="2026-04-22 19:36:52.640254584 +0000 UTC m=+785.288269380" lastFinishedPulling="2026-04-22 19:36:53.789296271 +0000 UTC m=+786.437311053" observedRunningTime="2026-04-22 19:36:54.524756473 +0000 UTC m=+787.172771277" watchObservedRunningTime="2026-04-22 19:36:54.529181584 +0000 UTC m=+787.177196389" Apr 22 19:36:56.516883 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:56.516846 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-z28l8" event={"ID":"d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd","Type":"ContainerStarted","Data":"ea4717d9eeb33271216fde7e1e92616a7c95d793bb63f90ba419ac17509f955a"} Apr 22 19:36:56.517328 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:56.516958 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-z28l8" Apr 22 19:36:56.536139 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:36:56.536060 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-z28l8" podStartSLOduration=1.980060762 podStartE2EDuration="4.536044531s" podCreationTimestamp="2026-04-22 19:36:52 +0000 UTC" firstStartedPulling="2026-04-22 19:36:53.289286861 +0000 UTC m=+785.937301648" lastFinishedPulling="2026-04-22 19:36:55.845270635 +0000 UTC m=+788.493285417" observedRunningTime="2026-04-22 19:36:56.534468957 +0000 UTC m=+789.182483765" watchObservedRunningTime="2026-04-22 19:36:56.536044531 +0000 UTC m=+789.184059335" Apr 22 19:37:05.516058 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:05.515970 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-tjsmp" Apr 22 19:37:07.522330 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:07.522300 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-z28l8" Apr 22 19:37:08.322675 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:08.322640 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-nw9l2"] Apr 22 19:37:08.325784 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:08.325766 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-nw9l2" Apr 22 19:37:08.336523 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:08.336495 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-nw9l2"] Apr 22 19:37:08.397704 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:08.397658 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw8tn\" (UniqueName: \"kubernetes.io/projected/9b3ab1b0-053b-458b-bf44-8c266bd8d7cc-kube-api-access-tw8tn\") pod \"s3-init-nw9l2\" (UID: \"9b3ab1b0-053b-458b-bf44-8c266bd8d7cc\") " pod="kserve/s3-init-nw9l2" Apr 22 19:37:08.498875 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:08.498839 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tw8tn\" (UniqueName: \"kubernetes.io/projected/9b3ab1b0-053b-458b-bf44-8c266bd8d7cc-kube-api-access-tw8tn\") pod \"s3-init-nw9l2\" (UID: \"9b3ab1b0-053b-458b-bf44-8c266bd8d7cc\") " pod="kserve/s3-init-nw9l2" Apr 22 19:37:08.507932 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:08.507906 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw8tn\" (UniqueName: \"kubernetes.io/projected/9b3ab1b0-053b-458b-bf44-8c266bd8d7cc-kube-api-access-tw8tn\") pod \"s3-init-nw9l2\" (UID: \"9b3ab1b0-053b-458b-bf44-8c266bd8d7cc\") " pod="kserve/s3-init-nw9l2" Apr 22 19:37:08.635563 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:08.635475 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-nw9l2" Apr 22 19:37:08.763316 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:08.763288 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-nw9l2"] Apr 22 19:37:08.765865 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:37:08.765842 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b3ab1b0_053b_458b_bf44_8c266bd8d7cc.slice/crio-38e5d0f16ef0e1a19a41400ae51479360f962526f6d13922ce736bcdd708fe24 WatchSource:0}: Error finding container 38e5d0f16ef0e1a19a41400ae51479360f962526f6d13922ce736bcdd708fe24: Status 404 returned error can't find the container with id 38e5d0f16ef0e1a19a41400ae51479360f962526f6d13922ce736bcdd708fe24 Apr 22 19:37:09.575350 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:09.575303 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nw9l2" event={"ID":"9b3ab1b0-053b-458b-bf44-8c266bd8d7cc","Type":"ContainerStarted","Data":"38e5d0f16ef0e1a19a41400ae51479360f962526f6d13922ce736bcdd708fe24"} Apr 22 19:37:13.593410 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:13.593310 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nw9l2" event={"ID":"9b3ab1b0-053b-458b-bf44-8c266bd8d7cc","Type":"ContainerStarted","Data":"d26d8e8e935100fd3485ad6919e6a817548437c274891c983955e33062e7dcc2"} Apr 22 19:37:13.612032 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:13.611977 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-nw9l2" podStartSLOduration=1.1533873940000001 podStartE2EDuration="5.611962511s" podCreationTimestamp="2026-04-22 19:37:08 +0000 UTC" firstStartedPulling="2026-04-22 19:37:08.767835249 +0000 UTC m=+801.415850032" lastFinishedPulling="2026-04-22 19:37:13.226410367 +0000 UTC m=+805.874425149" observedRunningTime="2026-04-22 19:37:13.61087036 +0000 UTC m=+806.258885165" watchObservedRunningTime="2026-04-22 19:37:13.611962511 +0000 UTC 
m=+806.259977313" Apr 22 19:37:16.606174 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:16.606134 2569 generic.go:358] "Generic (PLEG): container finished" podID="9b3ab1b0-053b-458b-bf44-8c266bd8d7cc" containerID="d26d8e8e935100fd3485ad6919e6a817548437c274891c983955e33062e7dcc2" exitCode=0 Apr 22 19:37:16.606627 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:16.606220 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nw9l2" event={"ID":"9b3ab1b0-053b-458b-bf44-8c266bd8d7cc","Type":"ContainerDied","Data":"d26d8e8e935100fd3485ad6919e6a817548437c274891c983955e33062e7dcc2"} Apr 22 19:37:17.745253 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:17.745226 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-nw9l2" Apr 22 19:37:17.881331 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:17.881243 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw8tn\" (UniqueName: \"kubernetes.io/projected/9b3ab1b0-053b-458b-bf44-8c266bd8d7cc-kube-api-access-tw8tn\") pod \"9b3ab1b0-053b-458b-bf44-8c266bd8d7cc\" (UID: \"9b3ab1b0-053b-458b-bf44-8c266bd8d7cc\") " Apr 22 19:37:17.883527 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:17.883502 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b3ab1b0-053b-458b-bf44-8c266bd8d7cc-kube-api-access-tw8tn" (OuterVolumeSpecName: "kube-api-access-tw8tn") pod "9b3ab1b0-053b-458b-bf44-8c266bd8d7cc" (UID: "9b3ab1b0-053b-458b-bf44-8c266bd8d7cc"). InnerVolumeSpecName "kube-api-access-tw8tn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:37:17.982219 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:17.982177 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tw8tn\" (UniqueName: \"kubernetes.io/projected/9b3ab1b0-053b-458b-bf44-8c266bd8d7cc-kube-api-access-tw8tn\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:37:18.615046 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:18.615010 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nw9l2" event={"ID":"9b3ab1b0-053b-458b-bf44-8c266bd8d7cc","Type":"ContainerDied","Data":"38e5d0f16ef0e1a19a41400ae51479360f962526f6d13922ce736bcdd708fe24"} Apr 22 19:37:18.615046 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:18.615044 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38e5d0f16ef0e1a19a41400ae51479360f962526f6d13922ce736bcdd708fe24" Apr 22 19:37:18.615046 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:18.615046 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-nw9l2" Apr 22 19:37:28.600491 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.600455 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf"] Apr 22 19:37:28.600888 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.600779 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b3ab1b0-053b-458b-bf44-8c266bd8d7cc" containerName="s3-init" Apr 22 19:37:28.600888 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.600791 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3ab1b0-053b-458b-bf44-8c266bd8d7cc" containerName="s3-init" Apr 22 19:37:28.600888 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.600869 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b3ab1b0-053b-458b-bf44-8c266bd8d7cc" containerName="s3-init" Apr 22 19:37:28.605058 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.605030 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.608557 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.608532 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 19:37:28.609009 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.608790 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 22 19:37:28.609009 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.608905 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-qbwxj\"" Apr 22 19:37:28.609189 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.609159 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 19:37:28.621951 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.619875 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf"] Apr 22 19:37:28.773239 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.773200 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/30245c48-e56c-4e0b-a86d-74ffeda7575b-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.773437 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.773255 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.773437 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.773323 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-workload-socket\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.773437 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.773375 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/30245c48-e56c-4e0b-a86d-74ffeda7575b-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.773437 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.773402 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/30245c48-e56c-4e0b-a86d-74ffeda7575b-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.773437 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.773429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.773437 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.773509 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqwzz\" (UniqueName: \"kubernetes.io/projected/30245c48-e56c-4e0b-a86d-74ffeda7575b-kube-api-access-dqwzz\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.773831 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.773549 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.773831 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.773589 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.874546 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.874441 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 
19:37:28.874546 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.874506 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqwzz\" (UniqueName: \"kubernetes.io/projected/30245c48-e56c-4e0b-a86d-74ffeda7575b-kube-api-access-dqwzz\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.874546 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.874536 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.874828 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.874560 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.874828 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.874663 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/30245c48-e56c-4e0b-a86d-74ffeda7575b-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.874828 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.874718 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.874828 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.874767 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.874828 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.874797 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/30245c48-e56c-4e0b-a86d-74ffeda7575b-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.874828 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.874819 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/30245c48-e56c-4e0b-a86d-74ffeda7575b-istiod-ca-cert\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.875211 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.874926 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.875211 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.875033 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.875314 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.875218 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.875314 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.875274 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.875524 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.875500 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/30245c48-e56c-4e0b-a86d-74ffeda7575b-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.877816 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.877790 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/30245c48-e56c-4e0b-a86d-74ffeda7575b-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.877912 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.877790 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/30245c48-e56c-4e0b-a86d-74ffeda7575b-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.885321 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.885290 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/30245c48-e56c-4e0b-a86d-74ffeda7575b-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.885521 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.885494 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqwzz\" (UniqueName: \"kubernetes.io/projected/30245c48-e56c-4e0b-a86d-74ffeda7575b-kube-api-access-dqwzz\") pod \"router-gateway-1-openshift-default-6c59fbf55c-n4lsf\" (UID: \"30245c48-e56c-4e0b-a86d-74ffeda7575b\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:28.917165 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:28.917090 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:29.073727 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:29.073695 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf"] Apr 22 19:37:29.075763 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:37:29.075712 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30245c48_e56c_4e0b_a86d_74ffeda7575b.slice/crio-5297e70cf547db5edad375963b869ab703cd761a05098f0b995f4772f30e1555 WatchSource:0}: Error finding container 5297e70cf547db5edad375963b869ab703cd761a05098f0b995f4772f30e1555: Status 404 returned error can't find the container with id 5297e70cf547db5edad375963b869ab703cd761a05098f0b995f4772f30e1555 Apr 22 19:37:29.078008 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:29.077974 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 19:37:29.078090 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:29.078041 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 19:37:29.078090 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:29.078071 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 19:37:29.661994 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:29.661952 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" event={"ID":"30245c48-e56c-4e0b-a86d-74ffeda7575b","Type":"ContainerStarted","Data":"d2fb0641805bf7da6f9b91326e1057dbc5dae01db1ac9f2c1d43ba15179ce1c1"} Apr 22 19:37:29.661994 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:29.661999 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" event={"ID":"30245c48-e56c-4e0b-a86d-74ffeda7575b","Type":"ContainerStarted","Data":"5297e70cf547db5edad375963b869ab703cd761a05098f0b995f4772f30e1555"} Apr 22 19:37:29.684410 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:29.684356 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" podStartSLOduration=1.684339832 
podStartE2EDuration="1.684339832s" podCreationTimestamp="2026-04-22 19:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:37:29.682588961 +0000 UTC m=+822.330603769" watchObservedRunningTime="2026-04-22 19:37:29.684339832 +0000 UTC m=+822.332354693" Apr 22 19:37:29.917288 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:29.917206 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:30.923123 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:30.923074 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:31.670132 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:31.670081 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:31.671022 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:31.671005 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-n4lsf" Apr 22 19:37:38.407132 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.407078 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767"] Apr 22 19:37:38.418533 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.418502 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.419038 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.419011 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767"] Apr 22 19:37:38.423036 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.423004 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-k6ghc\"" Apr 22 19:37:38.423213 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.423004 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-ztprn\"" Apr 22 19:37:38.423213 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.423004 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 22 19:37:38.456902 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.456864 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.456902 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.456917 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12cb0275-60bc-45a3-af63-772dfc5c283b-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: 
\"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.457218 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.456980 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnl2t\" (UniqueName: \"kubernetes.io/projected/12cb0275-60bc-45a3-af63-772dfc5c283b-kube-api-access-gnl2t\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.457218 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.457030 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.457218 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.457080 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.457218 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.457127 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.558253 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.558212 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.558253 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.558255 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.558472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.558329 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: 
\"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.558523 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.558474 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12cb0275-60bc-45a3-af63-772dfc5c283b-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.558577 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.558527 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnl2t\" (UniqueName: \"kubernetes.io/projected/12cb0275-60bc-45a3-af63-772dfc5c283b-kube-api-access-gnl2t\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.558577 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.558568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.558716 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.558692 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.558780 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.558714 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.558834 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.558773 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.558920 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.558902 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.561222 
ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.561198 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12cb0275-60bc-45a3-af63-772dfc5c283b-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.567450 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.567407 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnl2t\" (UniqueName: \"kubernetes.io/projected/12cb0275-60bc-45a3-af63-772dfc5c283b-kube-api-access-gnl2t\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-886559z767\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.729957 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.729922 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:37:38.889886 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:38.889845 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767"] Apr 22 19:37:38.891846 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:37:38.891816 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12cb0275_60bc_45a3_af63_772dfc5c283b.slice/crio-4b63110f3ce88950a25f6181a6676c18aa9ef81a7bbf5ce6e327c8499ff11314 WatchSource:0}: Error finding container 4b63110f3ce88950a25f6181a6676c18aa9ef81a7bbf5ce6e327c8499ff11314: Status 404 returned error can't find the container with id 4b63110f3ce88950a25f6181a6676c18aa9ef81a7bbf5ce6e327c8499ff11314 Apr 22 19:37:39.701276 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:39.701235 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" event={"ID":"12cb0275-60bc-45a3-af63-772dfc5c283b","Type":"ContainerStarted","Data":"4b63110f3ce88950a25f6181a6676c18aa9ef81a7bbf5ce6e327c8499ff11314"} Apr 22 19:37:42.715488 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:42.715451 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" event={"ID":"12cb0275-60bc-45a3-af63-772dfc5c283b","Type":"ContainerStarted","Data":"09e9554a34eddeb33c01675e5d4d56af798019a9166ba501eff3fd69358ff932"} Apr 22 19:37:43.720498 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:43.720451 2569 generic.go:358] "Generic (PLEG): container finished" podID="12cb0275-60bc-45a3-af63-772dfc5c283b" containerID="09e9554a34eddeb33c01675e5d4d56af798019a9166ba501eff3fd69358ff932" exitCode=0 Apr 22 19:37:43.720886 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:43.720540 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" event={"ID":"12cb0275-60bc-45a3-af63-772dfc5c283b","Type":"ContainerDied","Data":"09e9554a34eddeb33c01675e5d4d56af798019a9166ba501eff3fd69358ff932"} Apr 22 19:37:45.731140 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:37:45.731084 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" event={"ID":"12cb0275-60bc-45a3-af63-772dfc5c283b","Type":"ContainerStarted","Data":"00e2a47a3828c94133d96dfdc29ef47b6c93d52619d7ca03b462d234c7f81b05"} Apr 22 19:38:16.880262 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:16.880229 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" event={"ID":"12cb0275-60bc-45a3-af63-772dfc5c283b","Type":"ContainerStarted","Data":"7bc71f81daaee62e7d3a679516687d33506e95622eb3889181b1cc8b4359ce17"} Apr 22 19:38:16.880703 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:16.880489 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:38:16.883140 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:16.883117 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:38:16.905743 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:16.905677 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" podStartSLOduration=1.845713878 podStartE2EDuration="38.905657573s" podCreationTimestamp="2026-04-22 19:37:38 +0000 UTC" firstStartedPulling="2026-04-22 19:37:38.89423468 +0000 UTC m=+831.542249469" lastFinishedPulling="2026-04-22 19:38:15.954178382 +0000 UTC m=+868.602193164" observedRunningTime="2026-04-22 19:38:16.902852059 +0000 UTC m=+869.550866865" watchObservedRunningTime="2026-04-22 19:38:16.905657573 +0000 UTC m=+869.553672381" Apr 22 19:38:18.730423 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:18.730381 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:38:18.730812 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:18.730529 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:38:28.732786 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:28.732749 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:38:28.734019 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:28.733998 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:38:30.255070 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:30.255028 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767"] Apr 22 19:38:30.255532 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:30.255361 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" podUID="12cb0275-60bc-45a3-af63-772dfc5c283b" containerName="main" containerID="cri-o://00e2a47a3828c94133d96dfdc29ef47b6c93d52619d7ca03b462d234c7f81b05" gracePeriod=30 Apr 22 19:38:30.255532 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:30.255405 2569 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" podUID="12cb0275-60bc-45a3-af63-772dfc5c283b" containerName="tokenizer" containerID="cri-o://7bc71f81daaee62e7d3a679516687d33506e95622eb3889181b1cc8b4359ce17" gracePeriod=30 Apr 22 19:38:30.935304 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:30.935267 2569 generic.go:358] "Generic (PLEG): container finished" podID="12cb0275-60bc-45a3-af63-772dfc5c283b" containerID="00e2a47a3828c94133d96dfdc29ef47b6c93d52619d7ca03b462d234c7f81b05" exitCode=0 Apr 22 19:38:30.935491 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:30.935315 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" event={"ID":"12cb0275-60bc-45a3-af63-772dfc5c283b","Type":"ContainerDied","Data":"00e2a47a3828c94133d96dfdc29ef47b6c93d52619d7ca03b462d234c7f81b05"} Apr 22 19:38:31.604233 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.604207 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:38:31.650731 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.650639 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-cache\") pod \"12cb0275-60bc-45a3-af63-772dfc5c283b\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " Apr 22 19:38:31.650731 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.650698 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-uds\") pod \"12cb0275-60bc-45a3-af63-772dfc5c283b\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " Apr 22 19:38:31.650953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.650734 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-tmp\") pod \"12cb0275-60bc-45a3-af63-772dfc5c283b\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " Apr 22 19:38:31.650953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.650784 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-kserve-provision-location\") pod \"12cb0275-60bc-45a3-af63-772dfc5c283b\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " Apr 22 19:38:31.651047 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.650957 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "12cb0275-60bc-45a3-af63-772dfc5c283b" (UID: "12cb0275-60bc-45a3-af63-772dfc5c283b"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:38:31.651047 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.651013 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "12cb0275-60bc-45a3-af63-772dfc5c283b" (UID: "12cb0275-60bc-45a3-af63-772dfc5c283b"). 
InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:38:31.651195 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.651172 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "12cb0275-60bc-45a3-af63-772dfc5c283b" (UID: "12cb0275-60bc-45a3-af63-772dfc5c283b"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:38:31.651567 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.651547 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "12cb0275-60bc-45a3-af63-772dfc5c283b" (UID: "12cb0275-60bc-45a3-af63-772dfc5c283b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:38:31.751293 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.751255 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12cb0275-60bc-45a3-af63-772dfc5c283b-tls-certs\") pod \"12cb0275-60bc-45a3-af63-772dfc5c283b\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " Apr 22 19:38:31.751293 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.751293 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnl2t\" (UniqueName: \"kubernetes.io/projected/12cb0275-60bc-45a3-af63-772dfc5c283b-kube-api-access-gnl2t\") pod \"12cb0275-60bc-45a3-af63-772dfc5c283b\" (UID: \"12cb0275-60bc-45a3-af63-772dfc5c283b\") " Apr 22 19:38:31.751541 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.751465 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:38:31.751541 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.751478 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-uds\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:38:31.751541 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.751487 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-tokenizer-tmp\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:38:31.751541 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.751496 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12cb0275-60bc-45a3-af63-772dfc5c283b-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:38:31.753655 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.753621 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12cb0275-60bc-45a3-af63-772dfc5c283b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "12cb0275-60bc-45a3-af63-772dfc5c283b" (UID: "12cb0275-60bc-45a3-af63-772dfc5c283b"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:38:31.753757 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.753660 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12cb0275-60bc-45a3-af63-772dfc5c283b-kube-api-access-gnl2t" (OuterVolumeSpecName: "kube-api-access-gnl2t") pod "12cb0275-60bc-45a3-af63-772dfc5c283b" (UID: "12cb0275-60bc-45a3-af63-772dfc5c283b"). InnerVolumeSpecName "kube-api-access-gnl2t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:38:31.852560 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.852504 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12cb0275-60bc-45a3-af63-772dfc5c283b-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:38:31.852560 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.852557 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gnl2t\" (UniqueName: \"kubernetes.io/projected/12cb0275-60bc-45a3-af63-772dfc5c283b-kube-api-access-gnl2t\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:38:31.940989 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.940905 2569 generic.go:358] "Generic (PLEG): container finished" podID="12cb0275-60bc-45a3-af63-772dfc5c283b" containerID="7bc71f81daaee62e7d3a679516687d33506e95622eb3889181b1cc8b4359ce17" exitCode=0 Apr 22 19:38:31.941152 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.940992 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" Apr 22 19:38:31.941152 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.940991 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" event={"ID":"12cb0275-60bc-45a3-af63-772dfc5c283b","Type":"ContainerDied","Data":"7bc71f81daaee62e7d3a679516687d33506e95622eb3889181b1cc8b4359ce17"} Apr 22 19:38:31.941152 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.941119 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767" event={"ID":"12cb0275-60bc-45a3-af63-772dfc5c283b","Type":"ContainerDied","Data":"4b63110f3ce88950a25f6181a6676c18aa9ef81a7bbf5ce6e327c8499ff11314"} Apr 22 19:38:31.941152 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.941143 2569 scope.go:117] "RemoveContainer" containerID="7bc71f81daaee62e7d3a679516687d33506e95622eb3889181b1cc8b4359ce17" Apr 22 19:38:31.950397 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.950377 2569 scope.go:117] "RemoveContainer" containerID="00e2a47a3828c94133d96dfdc29ef47b6c93d52619d7ca03b462d234c7f81b05" Apr 22 19:38:31.959927 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.959906 2569 scope.go:117] "RemoveContainer" containerID="09e9554a34eddeb33c01675e5d4d56af798019a9166ba501eff3fd69358ff932" Apr 22 19:38:31.964192 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.964167 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767"] Apr 22 19:38:31.967599 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.967574 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-886559z767"] Apr 22 19:38:31.969658 ip-10-0-140-242 kubenswrapper[2569]: I0422 
19:38:31.969636 2569 scope.go:117] "RemoveContainer" containerID="7bc71f81daaee62e7d3a679516687d33506e95622eb3889181b1cc8b4359ce17" Apr 22 19:38:31.969951 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:38:31.969933 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bc71f81daaee62e7d3a679516687d33506e95622eb3889181b1cc8b4359ce17\": container with ID starting with 7bc71f81daaee62e7d3a679516687d33506e95622eb3889181b1cc8b4359ce17 not found: ID does not exist" containerID="7bc71f81daaee62e7d3a679516687d33506e95622eb3889181b1cc8b4359ce17" Apr 22 19:38:31.970005 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.969960 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc71f81daaee62e7d3a679516687d33506e95622eb3889181b1cc8b4359ce17"} err="failed to get container status \"7bc71f81daaee62e7d3a679516687d33506e95622eb3889181b1cc8b4359ce17\": rpc error: code = NotFound desc = could not find container \"7bc71f81daaee62e7d3a679516687d33506e95622eb3889181b1cc8b4359ce17\": container with ID starting with 7bc71f81daaee62e7d3a679516687d33506e95622eb3889181b1cc8b4359ce17 not found: ID does not exist" Apr 22 19:38:31.970005 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.969978 2569 scope.go:117] "RemoveContainer" containerID="00e2a47a3828c94133d96dfdc29ef47b6c93d52619d7ca03b462d234c7f81b05" Apr 22 19:38:31.970274 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:38:31.970257 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00e2a47a3828c94133d96dfdc29ef47b6c93d52619d7ca03b462d234c7f81b05\": container with ID starting with 00e2a47a3828c94133d96dfdc29ef47b6c93d52619d7ca03b462d234c7f81b05 not found: ID does not exist" containerID="00e2a47a3828c94133d96dfdc29ef47b6c93d52619d7ca03b462d234c7f81b05" Apr 22 19:38:31.970335 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.970278 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e2a47a3828c94133d96dfdc29ef47b6c93d52619d7ca03b462d234c7f81b05"} err="failed to get container status \"00e2a47a3828c94133d96dfdc29ef47b6c93d52619d7ca03b462d234c7f81b05\": rpc error: code = NotFound desc = could not find container \"00e2a47a3828c94133d96dfdc29ef47b6c93d52619d7ca03b462d234c7f81b05\": container with ID starting with 00e2a47a3828c94133d96dfdc29ef47b6c93d52619d7ca03b462d234c7f81b05 not found: ID does not exist" Apr 22 19:38:31.970335 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.970292 2569 scope.go:117] "RemoveContainer" containerID="09e9554a34eddeb33c01675e5d4d56af798019a9166ba501eff3fd69358ff932" Apr 22 19:38:31.970514 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:38:31.970492 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e9554a34eddeb33c01675e5d4d56af798019a9166ba501eff3fd69358ff932\": container with ID starting with 09e9554a34eddeb33c01675e5d4d56af798019a9166ba501eff3fd69358ff932 not found: ID does not exist" containerID="09e9554a34eddeb33c01675e5d4d56af798019a9166ba501eff3fd69358ff932" Apr 22 19:38:31.970555 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:31.970524 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e9554a34eddeb33c01675e5d4d56af798019a9166ba501eff3fd69358ff932"} err="failed to get container status 
\"09e9554a34eddeb33c01675e5d4d56af798019a9166ba501eff3fd69358ff932\": rpc error: code = NotFound desc = could not find container \"09e9554a34eddeb33c01675e5d4d56af798019a9166ba501eff3fd69358ff932\": container with ID starting with 09e9554a34eddeb33c01675e5d4d56af798019a9166ba501eff3fd69358ff932 not found: ID does not exist" Apr 22 19:38:33.951498 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:33.951466 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12cb0275-60bc-45a3-af63-772dfc5c283b" path="/var/lib/kubelet/pods/12cb0275-60bc-45a3-af63-772dfc5c283b/volumes" Apr 22 19:38:36.134992 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.134960 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6"] Apr 22 19:38:36.135355 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.135320 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12cb0275-60bc-45a3-af63-772dfc5c283b" containerName="tokenizer" Apr 22 19:38:36.135355 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.135331 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="12cb0275-60bc-45a3-af63-772dfc5c283b" containerName="tokenizer" Apr 22 19:38:36.135355 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.135346 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12cb0275-60bc-45a3-af63-772dfc5c283b" containerName="main" Apr 22 19:38:36.135355 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.135351 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="12cb0275-60bc-45a3-af63-772dfc5c283b" containerName="main" Apr 22 19:38:36.135492 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.135362 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12cb0275-60bc-45a3-af63-772dfc5c283b" containerName="storage-initializer" Apr 22 19:38:36.135492 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.135368 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="12cb0275-60bc-45a3-af63-772dfc5c283b" containerName="storage-initializer" Apr 22 19:38:36.135492 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.135422 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="12cb0275-60bc-45a3-af63-772dfc5c283b" containerName="main" Apr 22 19:38:36.135492 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.135430 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="12cb0275-60bc-45a3-af63-772dfc5c283b" containerName="tokenizer" Apr 22 19:38:36.141591 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.141553 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.147209 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.147186 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 22 19:38:36.147346 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.147186 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-k6ghc\"" Apr 22 19:38:36.156695 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.156667 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6"] Apr 22 19:38:36.184617 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.184588 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-model-cache\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.184799 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.184623 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmtrz\" (UniqueName: \"kubernetes.io/projected/e34d9e18-3ca8-4e66-a564-acd098fba7f1-kube-api-access-wmtrz\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.184799 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.184668 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.184799 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.184726 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-home\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.184799 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.184778 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-dshm\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.184953 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.184806 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e34d9e18-3ca8-4e66-a564-acd098fba7f1-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.285847 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.285799 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e34d9e18-3ca8-4e66-a564-acd098fba7f1-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.286026 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.285880 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-model-cache\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.286026 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.285904 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmtrz\" (UniqueName: \"kubernetes.io/projected/e34d9e18-3ca8-4e66-a564-acd098fba7f1-kube-api-access-wmtrz\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.286026 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.285979 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.286026 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.286021 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-home\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.286330 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.286167 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-dshm\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.286440 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.286418 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-model-cache\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.286523 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.286492 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.286586 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.286550 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-home\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.288576 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.288553 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e34d9e18-3ca8-4e66-a564-acd098fba7f1-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.288675 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.288578 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-dshm\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.295812 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.295785 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmtrz\" (UniqueName: \"kubernetes.io/projected/e34d9e18-3ca8-4e66-a564-acd098fba7f1-kube-api-access-wmtrz\") pod \"scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.448176 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.448138 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf"] Apr 22 19:38:36.451918 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.451895 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:36.452931 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.452858 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.456328 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.456309 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-nd9cm\"" Apr 22 19:38:36.486924 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.486877 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf"] Apr 22 19:38:36.487787 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.487760 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.487934 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.487798 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b111da3-dc21-46f2-a35d-05b4979f0e86-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.487934 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.487835 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.487934 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.487858 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khhvb\" (UniqueName: \"kubernetes.io/projected/1b111da3-dc21-46f2-a35d-05b4979f0e86-kube-api-access-khhvb\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.487934 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.487881 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.487934 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.487904 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.588688 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.588655 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.588885 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.588710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.588885 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.588741 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b111da3-dc21-46f2-a35d-05b4979f0e86-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.588885 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.588784 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.588885 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.588807 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khhvb\" (UniqueName: \"kubernetes.io/projected/1b111da3-dc21-46f2-a35d-05b4979f0e86-kube-api-access-khhvb\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.588885 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.588838 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.589177 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.589160 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.589238 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:38:36.589185 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.589238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.589216 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.589315 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.589256 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.591345 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.591323 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b111da3-dc21-46f2-a35d-05b4979f0e86-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.600311 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.600289 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khhvb\" (UniqueName: \"kubernetes.io/projected/1b111da3-dc21-46f2-a35d-05b4979f0e86-kube-api-access-khhvb\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.608226 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.608202 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6"] Apr 22 19:38:36.610109 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:38:36.610071 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode34d9e18_3ca8_4e66_a564_acd098fba7f1.slice/crio-35e90c5026704c07f9e865d5170a24cc7e69f96b849239334b69c8274c756f85 WatchSource:0}: Error finding container 35e90c5026704c07f9e865d5170a24cc7e69f96b849239334b69c8274c756f85: Status 404 returned error can't find the container with id 35e90c5026704c07f9e865d5170a24cc7e69f96b849239334b69c8274c756f85 Apr 22 19:38:36.777751 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.777651 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:36.922742 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.922714 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf"] Apr 22 19:38:36.924414 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:38:36.924386 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b111da3_dc21_46f2_a35d_05b4979f0e86.slice/crio-0b4196f5c8d0f9fac931efb26fb8be061d4becfc68d4378205a9c34c46faef93 WatchSource:0}: Error finding container 0b4196f5c8d0f9fac931efb26fb8be061d4becfc68d4378205a9c34c46faef93: Status 404 returned error can't find the container with id 0b4196f5c8d0f9fac931efb26fb8be061d4becfc68d4378205a9c34c46faef93 Apr 22 19:38:36.961291 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.961246 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" event={"ID":"1b111da3-dc21-46f2-a35d-05b4979f0e86","Type":"ContainerStarted","Data":"0b4196f5c8d0f9fac931efb26fb8be061d4becfc68d4378205a9c34c46faef93"} Apr 22 19:38:36.962695 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.962657 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" event={"ID":"e34d9e18-3ca8-4e66-a564-acd098fba7f1","Type":"ContainerStarted","Data":"9d03f351a3d9362237ac3db4e0f81120c0d69a06bd1a989973d02364197d8b83"} Apr 22 19:38:36.962695 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:36.962696 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" event={"ID":"e34d9e18-3ca8-4e66-a564-acd098fba7f1","Type":"ContainerStarted","Data":"35e90c5026704c07f9e865d5170a24cc7e69f96b849239334b69c8274c756f85"} Apr 22 19:38:37.968552 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:37.968517 2569 generic.go:358] "Generic (PLEG): container finished" podID="1b111da3-dc21-46f2-a35d-05b4979f0e86" containerID="ef1af51e94458430f1920bc4170c84e83de106774eafc1774a7ca0b4002f47c7" exitCode=0 Apr 22 19:38:37.969021 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:37.968602 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" event={"ID":"1b111da3-dc21-46f2-a35d-05b4979f0e86","Type":"ContainerDied","Data":"ef1af51e94458430f1920bc4170c84e83de106774eafc1774a7ca0b4002f47c7"} Apr 22 19:38:38.974730 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:38.974690 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" event={"ID":"1b111da3-dc21-46f2-a35d-05b4979f0e86","Type":"ContainerStarted","Data":"0d517bec784f210fecc8231a4a7f61a3f14a2d8eaf73cfda675de4cde4de4c61"} Apr 22 19:38:38.974730 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:38.974729 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" event={"ID":"1b111da3-dc21-46f2-a35d-05b4979f0e86","Type":"ContainerStarted","Data":"c30535a1dd301bf31e78c9bce0b42a92fe660baaa8749ec6776be769bdab32ab"} Apr 22 19:38:38.975176 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:38.974847 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:38.999082 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:38.999029 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" podStartSLOduration=2.999014719 podStartE2EDuration="2.999014719s" podCreationTimestamp="2026-04-22 19:38:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:38:38.997556452 +0000 UTC m=+891.645571257" watchObservedRunningTime="2026-04-22 19:38:38.999014719 +0000 UTC m=+891.647029523" Apr 22 19:38:41.991523 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:41.991484 2569 generic.go:358] "Generic (PLEG): container finished" podID="e34d9e18-3ca8-4e66-a564-acd098fba7f1" containerID="9d03f351a3d9362237ac3db4e0f81120c0d69a06bd1a989973d02364197d8b83" exitCode=0 Apr 22 19:38:41.991991 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:41.991538 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" event={"ID":"e34d9e18-3ca8-4e66-a564-acd098fba7f1","Type":"ContainerDied","Data":"9d03f351a3d9362237ac3db4e0f81120c0d69a06bd1a989973d02364197d8b83"} Apr 22 19:38:44.002577 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:44.002539 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" event={"ID":"e34d9e18-3ca8-4e66-a564-acd098fba7f1","Type":"ContainerStarted","Data":"4a22cf315f01c965286e5ebdea55acbeaa7701b6df3bb906d42615f3f50b032d"} Apr 22 19:38:44.027406 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:44.027312 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" podStartSLOduration=6.940730216 podStartE2EDuration="8.027292927s" podCreationTimestamp="2026-04-22 19:38:36 +0000 UTC" firstStartedPulling="2026-04-22 19:38:41.992596979 +0000 UTC m=+894.640611764" lastFinishedPulling="2026-04-22 19:38:43.079159691 +0000 UTC m=+895.727174475" observedRunningTime="2026-04-22 19:38:44.023661356 +0000 UTC m=+896.671676161" watchObservedRunningTime="2026-04-22 19:38:44.027292927 +0000 UTC m=+896.675307731" Apr 22 19:38:46.452211 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:46.452169 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:46.452771 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:46.452265 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:46.464690 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:46.464654 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:46.778546 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:46.778442 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:46.778546 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:46.778486 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:46.781352 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:46.781324 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:47.015340 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:47.015310 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:38:47.025378 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:47.025353 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:38:47.914801 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:47.914558 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:38:47.914801 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:38:47.914573 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:39:08.019875 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:08.019840 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:39:09.096672 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.096636 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6"] Apr 22 19:39:09.097118 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.097015 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" podUID="e34d9e18-3ca8-4e66-a564-acd098fba7f1" containerName="main" containerID="cri-o://4a22cf315f01c965286e5ebdea55acbeaa7701b6df3bb906d42615f3f50b032d" gracePeriod=30 Apr 22 19:39:09.106570 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.106532 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf"] Apr 22 19:39:09.106943 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.106911 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" podUID="1b111da3-dc21-46f2-a35d-05b4979f0e86" containerName="tokenizer" containerID="cri-o://0d517bec784f210fecc8231a4a7f61a3f14a2d8eaf73cfda675de4cde4de4c61" gracePeriod=30 Apr 22 19:39:09.107078 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.106891 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" podUID="1b111da3-dc21-46f2-a35d-05b4979f0e86" containerName="main" containerID="cri-o://c30535a1dd301bf31e78c9bce0b42a92fe660baaa8749ec6776be769bdab32ab" gracePeriod=30 Apr 22 19:39:09.354352 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.354281 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:39:09.486535 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.486500 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-home\") pod \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " Apr 22 19:39:09.486739 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.486577 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-model-cache\") pod \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " Apr 22 19:39:09.486739 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.486605 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e34d9e18-3ca8-4e66-a564-acd098fba7f1-tls-certs\") pod \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " Apr 22 19:39:09.486739 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.486655 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-kserve-provision-location\") pod \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " Apr 22 19:39:09.486739 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.486689 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmtrz\" (UniqueName: \"kubernetes.io/projected/e34d9e18-3ca8-4e66-a564-acd098fba7f1-kube-api-access-wmtrz\") pod \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " Apr 22 19:39:09.486739 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.486710 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-dshm\") pod \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\" (UID: \"e34d9e18-3ca8-4e66-a564-acd098fba7f1\") " Apr 22 19:39:09.486968 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.486780 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-home" (OuterVolumeSpecName: "home") pod "e34d9e18-3ca8-4e66-a564-acd098fba7f1" (UID: "e34d9e18-3ca8-4e66-a564-acd098fba7f1"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:09.486968 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.486871 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-model-cache" (OuterVolumeSpecName: "model-cache") pod "e34d9e18-3ca8-4e66-a564-acd098fba7f1" (UID: "e34d9e18-3ca8-4e66-a564-acd098fba7f1"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:09.487073 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.487026 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-home\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:09.487073 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.487049 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-model-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:09.489036 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.489007 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e34d9e18-3ca8-4e66-a564-acd098fba7f1-kube-api-access-wmtrz" (OuterVolumeSpecName: "kube-api-access-wmtrz") pod "e34d9e18-3ca8-4e66-a564-acd098fba7f1" (UID: "e34d9e18-3ca8-4e66-a564-acd098fba7f1"). InnerVolumeSpecName "kube-api-access-wmtrz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:39:09.489036 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.489029 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e34d9e18-3ca8-4e66-a564-acd098fba7f1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e34d9e18-3ca8-4e66-a564-acd098fba7f1" (UID: "e34d9e18-3ca8-4e66-a564-acd098fba7f1"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:39:09.489222 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.489050 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-dshm" (OuterVolumeSpecName: "dshm") pod "e34d9e18-3ca8-4e66-a564-acd098fba7f1" (UID: "e34d9e18-3ca8-4e66-a564-acd098fba7f1"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:09.548303 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.548244 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e34d9e18-3ca8-4e66-a564-acd098fba7f1" (UID: "e34d9e18-3ca8-4e66-a564-acd098fba7f1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:09.587655 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.587620 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e34d9e18-3ca8-4e66-a564-acd098fba7f1-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:09.587655 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.587651 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:09.587655 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.587662 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wmtrz\" (UniqueName: \"kubernetes.io/projected/e34d9e18-3ca8-4e66-a564-acd098fba7f1-kube-api-access-wmtrz\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:09.587976 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:09.587676 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e34d9e18-3ca8-4e66-a564-acd098fba7f1-dshm\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:10.103224 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.103190 2569 generic.go:358] "Generic (PLEG): container finished" podID="1b111da3-dc21-46f2-a35d-05b4979f0e86" containerID="c30535a1dd301bf31e78c9bce0b42a92fe660baaa8749ec6776be769bdab32ab" exitCode=0 Apr 22 19:39:10.103698 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.103268 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" event={"ID":"1b111da3-dc21-46f2-a35d-05b4979f0e86","Type":"ContainerDied","Data":"c30535a1dd301bf31e78c9bce0b42a92fe660baaa8749ec6776be769bdab32ab"} Apr 22 19:39:10.104671 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.104645 2569 generic.go:358] "Generic (PLEG): container finished" podID="e34d9e18-3ca8-4e66-a564-acd098fba7f1" containerID="4a22cf315f01c965286e5ebdea55acbeaa7701b6df3bb906d42615f3f50b032d" exitCode=0 Apr 22 19:39:10.104811 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.104732 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" event={"ID":"e34d9e18-3ca8-4e66-a564-acd098fba7f1","Type":"ContainerDied","Data":"4a22cf315f01c965286e5ebdea55acbeaa7701b6df3bb906d42615f3f50b032d"} Apr 22 19:39:10.104811 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.104767 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" event={"ID":"e34d9e18-3ca8-4e66-a564-acd098fba7f1","Type":"ContainerDied","Data":"35e90c5026704c07f9e865d5170a24cc7e69f96b849239334b69c8274c756f85"} Apr 22 19:39:10.104811 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.104782 2569 scope.go:117] "RemoveContainer" containerID="4a22cf315f01c965286e5ebdea55acbeaa7701b6df3bb906d42615f3f50b032d" Apr 22 19:39:10.104956 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.104740 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6" Apr 22 19:39:10.119679 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.119647 2569 scope.go:117] "RemoveContainer" containerID="9d03f351a3d9362237ac3db4e0f81120c0d69a06bd1a989973d02364197d8b83" Apr 22 19:39:10.127550 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.127525 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6"] Apr 22 19:39:10.133408 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.133385 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-56f8fff7fc-7r5h6"] Apr 22 19:39:10.187194 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.187155 2569 scope.go:117] "RemoveContainer" containerID="4a22cf315f01c965286e5ebdea55acbeaa7701b6df3bb906d42615f3f50b032d" Apr 22 19:39:10.187555 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:39:10.187532 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a22cf315f01c965286e5ebdea55acbeaa7701b6df3bb906d42615f3f50b032d\": container with ID starting with 4a22cf315f01c965286e5ebdea55acbeaa7701b6df3bb906d42615f3f50b032d not found: ID does not exist" containerID="4a22cf315f01c965286e5ebdea55acbeaa7701b6df3bb906d42615f3f50b032d" Apr 22 19:39:10.187617 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.187565 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a22cf315f01c965286e5ebdea55acbeaa7701b6df3bb906d42615f3f50b032d"} err="failed to get container status \"4a22cf315f01c965286e5ebdea55acbeaa7701b6df3bb906d42615f3f50b032d\": rpc error: code = NotFound desc = could not find container \"4a22cf315f01c965286e5ebdea55acbeaa7701b6df3bb906d42615f3f50b032d\": container with ID starting with 4a22cf315f01c965286e5ebdea55acbeaa7701b6df3bb906d42615f3f50b032d not found: ID does not exist" Apr 22 19:39:10.187617 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.187585 2569 scope.go:117] "RemoveContainer" containerID="9d03f351a3d9362237ac3db4e0f81120c0d69a06bd1a989973d02364197d8b83" Apr 22 19:39:10.187914 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:39:10.187884 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d03f351a3d9362237ac3db4e0f81120c0d69a06bd1a989973d02364197d8b83\": container with ID starting with 9d03f351a3d9362237ac3db4e0f81120c0d69a06bd1a989973d02364197d8b83 not found: ID does not exist" containerID="9d03f351a3d9362237ac3db4e0f81120c0d69a06bd1a989973d02364197d8b83" Apr 22 19:39:10.187977 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.187923 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d03f351a3d9362237ac3db4e0f81120c0d69a06bd1a989973d02364197d8b83"} err="failed to get container status \"9d03f351a3d9362237ac3db4e0f81120c0d69a06bd1a989973d02364197d8b83\": rpc error: code = NotFound desc = could not find container \"9d03f351a3d9362237ac3db4e0f81120c0d69a06bd1a989973d02364197d8b83\": container with ID starting with 9d03f351a3d9362237ac3db4e0f81120c0d69a06bd1a989973d02364197d8b83 not found: ID does not exist" Apr 22 19:39:10.461035 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.461011 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:39:10.596569 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.596466 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-kserve-provision-location\") pod \"1b111da3-dc21-46f2-a35d-05b4979f0e86\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " Apr 22 19:39:10.596745 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.596577 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-cache\") pod \"1b111da3-dc21-46f2-a35d-05b4979f0e86\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " Apr 22 19:39:10.596745 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.596616 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b111da3-dc21-46f2-a35d-05b4979f0e86-tls-certs\") pod \"1b111da3-dc21-46f2-a35d-05b4979f0e86\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " Apr 22 19:39:10.596745 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.596667 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-tmp\") pod \"1b111da3-dc21-46f2-a35d-05b4979f0e86\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " Apr 22 19:39:10.596745 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.596698 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-uds\") pod \"1b111da3-dc21-46f2-a35d-05b4979f0e86\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " Apr 22 19:39:10.596745 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.596725 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khhvb\" (UniqueName: \"kubernetes.io/projected/1b111da3-dc21-46f2-a35d-05b4979f0e86-kube-api-access-khhvb\") pod \"1b111da3-dc21-46f2-a35d-05b4979f0e86\" (UID: \"1b111da3-dc21-46f2-a35d-05b4979f0e86\") " Apr 22 19:39:10.596993 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.596793 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "1b111da3-dc21-46f2-a35d-05b4979f0e86" (UID: "1b111da3-dc21-46f2-a35d-05b4979f0e86"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:10.596993 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.596943 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "1b111da3-dc21-46f2-a35d-05b4979f0e86" (UID: "1b111da3-dc21-46f2-a35d-05b4979f0e86"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:10.597140 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.597014 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-uds\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:10.597140 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.597035 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "1b111da3-dc21-46f2-a35d-05b4979f0e86" (UID: "1b111da3-dc21-46f2-a35d-05b4979f0e86"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:10.597140 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.597041 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:10.597350 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.597330 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1b111da3-dc21-46f2-a35d-05b4979f0e86" (UID: "1b111da3-dc21-46f2-a35d-05b4979f0e86"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:10.598911 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.598888 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b111da3-dc21-46f2-a35d-05b4979f0e86-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1b111da3-dc21-46f2-a35d-05b4979f0e86" (UID: "1b111da3-dc21-46f2-a35d-05b4979f0e86"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:39:10.598995 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.598983 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b111da3-dc21-46f2-a35d-05b4979f0e86-kube-api-access-khhvb" (OuterVolumeSpecName: "kube-api-access-khhvb") pod "1b111da3-dc21-46f2-a35d-05b4979f0e86" (UID: "1b111da3-dc21-46f2-a35d-05b4979f0e86"). InnerVolumeSpecName "kube-api-access-khhvb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:39:10.697658 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.697611 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-tokenizer-tmp\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:10.697658 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.697649 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-khhvb\" (UniqueName: \"kubernetes.io/projected/1b111da3-dc21-46f2-a35d-05b4979f0e86-kube-api-access-khhvb\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:10.697658 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.697660 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b111da3-dc21-46f2-a35d-05b4979f0e86-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:10.697658 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:10.697670 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b111da3-dc21-46f2-a35d-05b4979f0e86-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:11.111293 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.111255 2569 generic.go:358] "Generic (PLEG): container finished" podID="1b111da3-dc21-46f2-a35d-05b4979f0e86" containerID="0d517bec784f210fecc8231a4a7f61a3f14a2d8eaf73cfda675de4cde4de4c61" exitCode=0 Apr 22 19:39:11.111743 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.111321 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" Apr 22 19:39:11.111743 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.111337 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" event={"ID":"1b111da3-dc21-46f2-a35d-05b4979f0e86","Type":"ContainerDied","Data":"0d517bec784f210fecc8231a4a7f61a3f14a2d8eaf73cfda675de4cde4de4c61"} Apr 22 19:39:11.111743 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.111381 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf" event={"ID":"1b111da3-dc21-46f2-a35d-05b4979f0e86","Type":"ContainerDied","Data":"0b4196f5c8d0f9fac931efb26fb8be061d4becfc68d4378205a9c34c46faef93"} Apr 22 19:39:11.111743 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.111403 2569 scope.go:117] "RemoveContainer" containerID="0d517bec784f210fecc8231a4a7f61a3f14a2d8eaf73cfda675de4cde4de4c61" Apr 22 19:39:11.120312 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.120293 2569 scope.go:117] "RemoveContainer" containerID="c30535a1dd301bf31e78c9bce0b42a92fe660baaa8749ec6776be769bdab32ab" Apr 22 19:39:11.128169 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.128151 2569 scope.go:117] "RemoveContainer" containerID="ef1af51e94458430f1920bc4170c84e83de106774eafc1774a7ca0b4002f47c7" Apr 22 19:39:11.134219 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.134183 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf"] Apr 22 19:39:11.137084 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.137064 2569 scope.go:117] "RemoveContainer" 
containerID="0d517bec784f210fecc8231a4a7f61a3f14a2d8eaf73cfda675de4cde4de4c61" Apr 22 19:39:11.137417 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:39:11.137398 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d517bec784f210fecc8231a4a7f61a3f14a2d8eaf73cfda675de4cde4de4c61\": container with ID starting with 0d517bec784f210fecc8231a4a7f61a3f14a2d8eaf73cfda675de4cde4de4c61 not found: ID does not exist" containerID="0d517bec784f210fecc8231a4a7f61a3f14a2d8eaf73cfda675de4cde4de4c61" Apr 22 19:39:11.137466 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.137426 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d517bec784f210fecc8231a4a7f61a3f14a2d8eaf73cfda675de4cde4de4c61"} err="failed to get container status \"0d517bec784f210fecc8231a4a7f61a3f14a2d8eaf73cfda675de4cde4de4c61\": rpc error: code = NotFound desc = could not find container \"0d517bec784f210fecc8231a4a7f61a3f14a2d8eaf73cfda675de4cde4de4c61\": container with ID starting with 0d517bec784f210fecc8231a4a7f61a3f14a2d8eaf73cfda675de4cde4de4c61 not found: ID does not exist" Apr 22 19:39:11.137466 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.137445 2569 scope.go:117] "RemoveContainer" containerID="c30535a1dd301bf31e78c9bce0b42a92fe660baaa8749ec6776be769bdab32ab" Apr 22 19:39:11.137700 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:39:11.137685 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c30535a1dd301bf31e78c9bce0b42a92fe660baaa8749ec6776be769bdab32ab\": container with ID starting with c30535a1dd301bf31e78c9bce0b42a92fe660baaa8749ec6776be769bdab32ab not found: ID does not exist" containerID="c30535a1dd301bf31e78c9bce0b42a92fe660baaa8749ec6776be769bdab32ab" Apr 22 19:39:11.137741 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.137704 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c30535a1dd301bf31e78c9bce0b42a92fe660baaa8749ec6776be769bdab32ab"} err="failed to get container status \"c30535a1dd301bf31e78c9bce0b42a92fe660baaa8749ec6776be769bdab32ab\": rpc error: code = NotFound desc = could not find container \"c30535a1dd301bf31e78c9bce0b42a92fe660baaa8749ec6776be769bdab32ab\": container with ID starting with c30535a1dd301bf31e78c9bce0b42a92fe660baaa8749ec6776be769bdab32ab not found: ID does not exist" Apr 22 19:39:11.137741 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.137717 2569 scope.go:117] "RemoveContainer" containerID="ef1af51e94458430f1920bc4170c84e83de106774eafc1774a7ca0b4002f47c7" Apr 22 19:39:11.137857 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.137839 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb8r87zf"] Apr 22 19:39:11.137966 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:39:11.137950 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef1af51e94458430f1920bc4170c84e83de106774eafc1774a7ca0b4002f47c7\": container with ID starting with ef1af51e94458430f1920bc4170c84e83de106774eafc1774a7ca0b4002f47c7 not found: ID does not exist" containerID="ef1af51e94458430f1920bc4170c84e83de106774eafc1774a7ca0b4002f47c7" Apr 22 19:39:11.138010 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.137971 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ef1af51e94458430f1920bc4170c84e83de106774eafc1774a7ca0b4002f47c7"} err="failed to get container status \"ef1af51e94458430f1920bc4170c84e83de106774eafc1774a7ca0b4002f47c7\": rpc error: code = NotFound desc = could not find container \"ef1af51e94458430f1920bc4170c84e83de106774eafc1774a7ca0b4002f47c7\": container with ID starting with ef1af51e94458430f1920bc4170c84e83de106774eafc1774a7ca0b4002f47c7 not found: ID does not exist" Apr 22 19:39:11.953158 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.953118 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b111da3-dc21-46f2-a35d-05b4979f0e86" path="/var/lib/kubelet/pods/1b111da3-dc21-46f2-a35d-05b4979f0e86/volumes" Apr 22 19:39:11.953612 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:11.953598 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e34d9e18-3ca8-4e66-a564-acd098fba7f1" path="/var/lib/kubelet/pods/e34d9e18-3ca8-4e66-a564-acd098fba7f1/volumes" Apr 22 19:39:18.621817 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.621780 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf"] Apr 22 19:39:18.622476 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.622325 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b111da3-dc21-46f2-a35d-05b4979f0e86" containerName="main" Apr 22 19:39:18.622476 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.622345 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b111da3-dc21-46f2-a35d-05b4979f0e86" containerName="main" Apr 22 19:39:18.622476 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.622369 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b111da3-dc21-46f2-a35d-05b4979f0e86" containerName="storage-initializer" Apr 22 19:39:18.622476 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.622378 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b111da3-dc21-46f2-a35d-05b4979f0e86" containerName="storage-initializer" Apr 22 19:39:18.622476 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.622394 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b111da3-dc21-46f2-a35d-05b4979f0e86" containerName="tokenizer" Apr 22 19:39:18.622476 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.622402 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b111da3-dc21-46f2-a35d-05b4979f0e86" containerName="tokenizer" Apr 22 19:39:18.622476 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.622432 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e34d9e18-3ca8-4e66-a564-acd098fba7f1" containerName="storage-initializer" Apr 22 19:39:18.622476 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.622440 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34d9e18-3ca8-4e66-a564-acd098fba7f1" containerName="storage-initializer" Apr 22 19:39:18.622476 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.622451 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e34d9e18-3ca8-4e66-a564-acd098fba7f1" containerName="main" Apr 22 19:39:18.622476 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.622460 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34d9e18-3ca8-4e66-a564-acd098fba7f1" containerName="main" Apr 22 19:39:18.622958 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.622533 2569 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="1b111da3-dc21-46f2-a35d-05b4979f0e86" containerName="main" Apr 22 19:39:18.622958 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.622546 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b111da3-dc21-46f2-a35d-05b4979f0e86" containerName="tokenizer" Apr 22 19:39:18.622958 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.622559 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e34d9e18-3ca8-4e66-a564-acd098fba7f1" containerName="main" Apr 22 19:39:18.628888 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.628865 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.632782 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.632757 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-k6ghc\"" Apr 22 19:39:18.632912 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.632757 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 22 19:39:18.637757 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.637732 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf"] Apr 22 19:39:18.764770 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.764736 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-model-cache\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.764931 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.764781 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.764931 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.764817 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7af7a3d5-2b85-4347-b744-43599d332ee2-tls-certs\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.764931 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.764836 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-home\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.764931 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.764862 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxjjl\" (UniqueName: 
\"kubernetes.io/projected/7af7a3d5-2b85-4347-b744-43599d332ee2-kube-api-access-hxjjl\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.764931 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.764910 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-dshm\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.865720 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.865671 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.865720 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.865725 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7af7a3d5-2b85-4347-b744-43599d332ee2-tls-certs\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.865961 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.865857 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-home\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.865961 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.865906 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxjjl\" (UniqueName: \"kubernetes.io/projected/7af7a3d5-2b85-4347-b744-43599d332ee2-kube-api-access-hxjjl\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.865961 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.865947 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-dshm\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.866153 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.865995 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-model-cache\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.866153 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.866084 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.866269 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.866242 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-home\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.866368 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.866345 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-model-cache\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.868322 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.868303 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-dshm\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.868444 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.868427 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7af7a3d5-2b85-4347-b744-43599d332ee2-tls-certs\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.875544 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.875478 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxjjl\" (UniqueName: \"kubernetes.io/projected/7af7a3d5-2b85-4347-b744-43599d332ee2-kube-api-access-hxjjl\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7tjcf\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.940513 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.940471 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:18.960920 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.960887 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2"] Apr 22 19:39:18.966870 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.966844 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:18.970129 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.970079 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-j7jqz\"" Apr 22 19:39:18.975305 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:18.975275 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2"] Apr 22 19:39:19.067373 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.067332 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.067373 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.067376 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.067613 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.067405 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.067613 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.067522 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.067613 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.067555 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l95x9\" (UniqueName: \"kubernetes.io/projected/59b41f44-a87c-41f9-a5f8-bcff7c859162-kube-api-access-l95x9\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.067756 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.067658 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/59b41f44-a87c-41f9-a5f8-bcff7c859162-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.088620 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.088585 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf"] Apr 22 19:39:19.091089 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:39:19.091061 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7af7a3d5_2b85_4347_b744_43599d332ee2.slice/crio-a8f6a46d767f3f9d11076e9f2cb4b3393fd0ad12e3588a2013d96b6e45d8fc83 WatchSource:0}: Error finding container a8f6a46d767f3f9d11076e9f2cb4b3393fd0ad12e3588a2013d96b6e45d8fc83: Status 404 returned error can't find the container with id a8f6a46d767f3f9d11076e9f2cb4b3393fd0ad12e3588a2013d96b6e45d8fc83 Apr 22 19:39:19.145467 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.145432 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" event={"ID":"7af7a3d5-2b85-4347-b744-43599d332ee2","Type":"ContainerStarted","Data":"a8f6a46d767f3f9d11076e9f2cb4b3393fd0ad12e3588a2013d96b6e45d8fc83"} Apr 22 19:39:19.168903 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.168864 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l95x9\" (UniqueName: \"kubernetes.io/projected/59b41f44-a87c-41f9-a5f8-bcff7c859162-kube-api-access-l95x9\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.169074 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.168921 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/59b41f44-a87c-41f9-a5f8-bcff7c859162-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.169074 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.168961 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.169074 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.168977 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.169074 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.168994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: 
\"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.169074 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.169026 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.169421 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.169395 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.169421 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.169410 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.169505 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.169471 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.169588 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.169550 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.171609 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.171592 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/59b41f44-a87c-41f9-a5f8-bcff7c859162-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.177738 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.177709 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l95x9\" (UniqueName: \"kubernetes.io/projected/59b41f44-a87c-41f9-a5f8-bcff7c859162-kube-api-access-l95x9\") pod \"precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.282561 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:39:19.282531 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:19.426428 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:39:19.426391 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59b41f44_a87c_41f9_a5f8_bcff7c859162.slice/crio-4dfa85bedea2ef0a1018057f36eb9a3de0999b196f6c6e22553a6bededf01926 WatchSource:0}: Error finding container 4dfa85bedea2ef0a1018057f36eb9a3de0999b196f6c6e22553a6bededf01926: Status 404 returned error can't find the container with id 4dfa85bedea2ef0a1018057f36eb9a3de0999b196f6c6e22553a6bededf01926 Apr 22 19:39:19.426572 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:19.426438 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2"] Apr 22 19:39:20.151715 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:20.151663 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" event={"ID":"7af7a3d5-2b85-4347-b744-43599d332ee2","Type":"ContainerStarted","Data":"c21ea7090aefabee02e995f6cb0c18204bec2cdc5ac70ce2f63d123e51bc77a3"} Apr 22 19:39:20.153516 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:20.153482 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" event={"ID":"59b41f44-a87c-41f9-a5f8-bcff7c859162","Type":"ContainerStarted","Data":"3dc4de612aec1b67676ed0f0418deeed44de1faf3587fd4c538f488464505752"} Apr 22 19:39:20.153710 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:20.153525 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" event={"ID":"59b41f44-a87c-41f9-a5f8-bcff7c859162","Type":"ContainerStarted","Data":"4dfa85bedea2ef0a1018057f36eb9a3de0999b196f6c6e22553a6bededf01926"} Apr 22 19:39:21.159160 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:21.159118 2569 generic.go:358] "Generic (PLEG): container finished" podID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerID="3dc4de612aec1b67676ed0f0418deeed44de1faf3587fd4c538f488464505752" exitCode=0 Apr 22 19:39:21.159641 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:21.159203 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" event={"ID":"59b41f44-a87c-41f9-a5f8-bcff7c859162","Type":"ContainerDied","Data":"3dc4de612aec1b67676ed0f0418deeed44de1faf3587fd4c538f488464505752"} Apr 22 19:39:22.165809 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:22.165761 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" event={"ID":"59b41f44-a87c-41f9-a5f8-bcff7c859162","Type":"ContainerStarted","Data":"a4af5a5ca61724e40b18851855e30cda3854f2e49b2407ed1ff20a2d6a952609"} Apr 22 19:39:22.165809 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:22.165809 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" event={"ID":"59b41f44-a87c-41f9-a5f8-bcff7c859162","Type":"ContainerStarted","Data":"dd93851a8c273319aebcfe6cda8caaf38cdb0d054d86c98afe07d8e107bcda03"} Apr 22 19:39:22.166368 
ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:22.165872 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:22.189367 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:22.189314 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" podStartSLOduration=4.1892992079999996 podStartE2EDuration="4.189299208s" podCreationTimestamp="2026-04-22 19:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:39:22.187054577 +0000 UTC m=+934.835069380" watchObservedRunningTime="2026-04-22 19:39:22.189299208 +0000 UTC m=+934.837314009" Apr 22 19:39:23.469457 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:39:23.469422 2569 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7af7a3d5_2b85_4347_b744_43599d332ee2.slice/crio-c21ea7090aefabee02e995f6cb0c18204bec2cdc5ac70ce2f63d123e51bc77a3.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:39:23.469861 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:39:23.469596 2569 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7af7a3d5_2b85_4347_b744_43599d332ee2.slice/crio-c21ea7090aefabee02e995f6cb0c18204bec2cdc5ac70ce2f63d123e51bc77a3.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:39:24.176357 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:24.176317 2569 generic.go:358] "Generic (PLEG): container finished" podID="7af7a3d5-2b85-4347-b744-43599d332ee2" containerID="c21ea7090aefabee02e995f6cb0c18204bec2cdc5ac70ce2f63d123e51bc77a3" exitCode=0 Apr 22 19:39:24.176540 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:24.176364 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" event={"ID":"7af7a3d5-2b85-4347-b744-43599d332ee2","Type":"ContainerDied","Data":"c21ea7090aefabee02e995f6cb0c18204bec2cdc5ac70ce2f63d123e51bc77a3"} Apr 22 19:39:25.181748 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.181704 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" event={"ID":"7af7a3d5-2b85-4347-b744-43599d332ee2","Type":"ContainerStarted","Data":"8381acf58a0c54dfeff445ee59cc090e4f9b8e49b72c85ffd981e0c6addadb82"} Apr 22 19:39:25.203742 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.203689 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" podStartSLOduration=7.203673486 podStartE2EDuration="7.203673486s" podCreationTimestamp="2026-04-22 19:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:39:25.201672632 +0000 UTC m=+937.849687437" watchObservedRunningTime="2026-04-22 19:39:25.203673486 +0000 UTC m=+937.851688290" Apr 22 19:39:25.262445 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.262410 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll"] Apr 22 19:39:25.266836 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.266808 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.269538 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.269510 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 22 19:39:25.269697 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.269563 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-n5sjq\"" Apr 22 19:39:25.278245 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.278145 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll"] Apr 22 19:39:25.426906 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.426864 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.426906 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.426924 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.427149 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.427023 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.427149 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.427057 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.427149 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.427132 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpxr6\" (UniqueName: \"kubernetes.io/projected/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-kube-api-access-lpxr6\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.427273 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.427179 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.527856 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.527820 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.528058 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.527883 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.528058 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.527905 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.528058 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.527930 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpxr6\" (UniqueName: \"kubernetes.io/projected/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-kube-api-access-lpxr6\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.528058 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.527954 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.528058 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.527973 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 
19:39:25.528365 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.528327 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.528365 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.528347 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.528444 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.528370 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.528444 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.528416 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.530801 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.530775 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.536694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.536665 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpxr6\" (UniqueName: \"kubernetes.io/projected/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-kube-api-access-lpxr6\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.580373 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.580323 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:25.721852 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:25.721826 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll"] Apr 22 19:39:25.723668 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:39:25.723634 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd2ba92e_a328_4a62_b5e6_b88b26ade46d.slice/crio-9ec9014ac45d595460a542e4f25da023683ea1a93e2e007b6544e23d852b7258 WatchSource:0}: Error finding container 9ec9014ac45d595460a542e4f25da023683ea1a93e2e007b6544e23d852b7258: Status 404 returned error can't find the container with id 9ec9014ac45d595460a542e4f25da023683ea1a93e2e007b6544e23d852b7258 Apr 22 19:39:26.187828 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:26.187733 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" event={"ID":"fd2ba92e-a328-4a62-b5e6-b88b26ade46d","Type":"ContainerStarted","Data":"71fc2b59c11af40a473edb4f29ca50ec353263b39776f264fd71aa6ad24e186d"} Apr 22 19:39:26.187828 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:26.187798 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" event={"ID":"fd2ba92e-a328-4a62-b5e6-b88b26ade46d","Type":"ContainerStarted","Data":"9ec9014ac45d595460a542e4f25da023683ea1a93e2e007b6544e23d852b7258"} Apr 22 19:39:27.197816 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:27.196538 2569 generic.go:358] "Generic (PLEG): container finished" podID="fd2ba92e-a328-4a62-b5e6-b88b26ade46d" containerID="71fc2b59c11af40a473edb4f29ca50ec353263b39776f264fd71aa6ad24e186d" exitCode=0 Apr 22 19:39:27.197816 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:27.196743 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" event={"ID":"fd2ba92e-a328-4a62-b5e6-b88b26ade46d","Type":"ContainerDied","Data":"71fc2b59c11af40a473edb4f29ca50ec353263b39776f264fd71aa6ad24e186d"} Apr 22 19:39:28.203205 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:28.203168 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" event={"ID":"fd2ba92e-a328-4a62-b5e6-b88b26ade46d","Type":"ContainerStarted","Data":"53a22bf96312256c75515289e9fd1ab4c5fd659fc3e04ad555f3d4814226fa7c"} Apr 22 19:39:28.203205 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:28.203211 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" event={"ID":"fd2ba92e-a328-4a62-b5e6-b88b26ade46d","Type":"ContainerStarted","Data":"f13db1522c5a8ed6c83d6c6269e55e2f26160f205aee87072fd8035977dba694"} Apr 22 19:39:28.203614 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:28.203335 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:28.226867 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:28.226794 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" 
podStartSLOduration=3.226775448 podStartE2EDuration="3.226775448s" podCreationTimestamp="2026-04-22 19:39:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:39:28.225468104 +0000 UTC m=+940.873482921" watchObservedRunningTime="2026-04-22 19:39:28.226775448 +0000 UTC m=+940.874790256" Apr 22 19:39:28.941177 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:28.941137 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:28.941389 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:28.941281 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:28.954861 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:28.954825 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:29.224176 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:29.224130 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:29.283318 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:29.283289 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:29.283516 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:29.283441 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:29.284416 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:39:29.284392 2569 logging.go:55] [core] [Channel #41 SubChannel #42]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.44:9003", ServerName: "10.133.0.44:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.44:9003: connect: connection refused" Apr 22 19:39:29.285820 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:29.285793 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:30.212645 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:30.212615 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:30.283536 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:30.283494 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" podUID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.44:9003\" within 1s: context deadline exceeded" Apr 22 19:39:35.581415 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:35.581380 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:35.581415 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:35.581424 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:35.584171 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:35.584143 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:36.245694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:36.245664 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:39:39.283510 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:39:39.283478 2569 logging.go:55] [core] [Channel #46 SubChannel #47]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.44:9003", ServerName: "10.133.0.44:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.44:9003: connect: connection refused" Apr 22 19:39:40.283166 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:40.283125 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" podUID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.44:9003\" within 1s: context deadline exceeded" Apr 22 19:39:52.228717 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:52.228683 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:53.177890 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.177858 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf"] Apr 22 19:39:53.178191 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.178144 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" podUID="7af7a3d5-2b85-4347-b744-43599d332ee2" containerName="main" containerID="cri-o://8381acf58a0c54dfeff445ee59cc090e4f9b8e49b72c85ffd981e0c6addadb82" gracePeriod=30 Apr 22 19:39:53.190585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.190559 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2"] Apr 22 19:39:53.190844 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.190817 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" podUID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerName="main" containerID="cri-o://dd93851a8c273319aebcfe6cda8caaf38cdb0d054d86c98afe07d8e107bcda03" gracePeriod=30 Apr 22 19:39:53.190957 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.190890 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" podUID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerName="tokenizer" containerID="cri-o://a4af5a5ca61724e40b18851855e30cda3854f2e49b2407ed1ff20a2d6a952609" gracePeriod=30 Apr 22 19:39:53.310542 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.310511 2569 generic.go:358] "Generic (PLEG): container finished" podID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerID="dd93851a8c273319aebcfe6cda8caaf38cdb0d054d86c98afe07d8e107bcda03" exitCode=0 Apr 22 19:39:53.310950 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.310578 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" event={"ID":"59b41f44-a87c-41f9-a5f8-bcff7c859162","Type":"ContainerDied","Data":"dd93851a8c273319aebcfe6cda8caaf38cdb0d054d86c98afe07d8e107bcda03"} Apr 22 19:39:53.312484 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.312458 2569 generic.go:358] "Generic (PLEG): container finished" podID="7af7a3d5-2b85-4347-b744-43599d332ee2" containerID="8381acf58a0c54dfeff445ee59cc090e4f9b8e49b72c85ffd981e0c6addadb82" exitCode=0 Apr 22 19:39:53.312615 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.312529 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" event={"ID":"7af7a3d5-2b85-4347-b744-43599d332ee2","Type":"ContainerDied","Data":"8381acf58a0c54dfeff445ee59cc090e4f9b8e49b72c85ffd981e0c6addadb82"} Apr 22 19:39:53.467593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.467571 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:53.577761 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.577713 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-home\") pod \"7af7a3d5-2b85-4347-b744-43599d332ee2\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " Apr 22 19:39:53.577761 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.577768 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7af7a3d5-2b85-4347-b744-43599d332ee2-tls-certs\") pod \"7af7a3d5-2b85-4347-b744-43599d332ee2\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " Apr 22 19:39:53.578005 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.577814 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-model-cache\") pod \"7af7a3d5-2b85-4347-b744-43599d332ee2\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " Apr 22 19:39:53.578005 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.577854 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxjjl\" (UniqueName: \"kubernetes.io/projected/7af7a3d5-2b85-4347-b744-43599d332ee2-kube-api-access-hxjjl\") pod \"7af7a3d5-2b85-4347-b744-43599d332ee2\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " Apr 22 19:39:53.578005 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.577887 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-dshm\") pod \"7af7a3d5-2b85-4347-b744-43599d332ee2\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " Apr 22 19:39:53.578005 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.577927 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-kserve-provision-location\") pod \"7af7a3d5-2b85-4347-b744-43599d332ee2\" (UID: \"7af7a3d5-2b85-4347-b744-43599d332ee2\") " Apr 22 19:39:53.578233 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.578004 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-home" (OuterVolumeSpecName: "home") pod "7af7a3d5-2b85-4347-b744-43599d332ee2" (UID: "7af7a3d5-2b85-4347-b744-43599d332ee2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:53.578233 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.578066 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-model-cache" (OuterVolumeSpecName: "model-cache") pod "7af7a3d5-2b85-4347-b744-43599d332ee2" (UID: "7af7a3d5-2b85-4347-b744-43599d332ee2"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:53.578322 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.578246 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-home\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:53.578322 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.578266 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-model-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:53.580582 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.580537 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7af7a3d5-2b85-4347-b744-43599d332ee2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7af7a3d5-2b85-4347-b744-43599d332ee2" (UID: "7af7a3d5-2b85-4347-b744-43599d332ee2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:39:53.580709 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.580621 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-dshm" (OuterVolumeSpecName: "dshm") pod "7af7a3d5-2b85-4347-b744-43599d332ee2" (UID: "7af7a3d5-2b85-4347-b744-43599d332ee2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:53.580709 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.580690 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7af7a3d5-2b85-4347-b744-43599d332ee2-kube-api-access-hxjjl" (OuterVolumeSpecName: "kube-api-access-hxjjl") pod "7af7a3d5-2b85-4347-b744-43599d332ee2" (UID: "7af7a3d5-2b85-4347-b744-43599d332ee2"). InnerVolumeSpecName "kube-api-access-hxjjl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:39:53.633185 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.633091 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7af7a3d5-2b85-4347-b744-43599d332ee2" (UID: "7af7a3d5-2b85-4347-b744-43599d332ee2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:53.679564 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.679517 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:53.679564 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.679557 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7af7a3d5-2b85-4347-b744-43599d332ee2-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:53.679564 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.679569 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hxjjl\" (UniqueName: \"kubernetes.io/projected/7af7a3d5-2b85-4347-b744-43599d332ee2-kube-api-access-hxjjl\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:53.679852 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:53.679579 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7af7a3d5-2b85-4347-b744-43599d332ee2-dshm\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:54.317706 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:54.317626 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" Apr 22 19:39:54.317706 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:54.317651 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf" event={"ID":"7af7a3d5-2b85-4347-b744-43599d332ee2","Type":"ContainerDied","Data":"a8f6a46d767f3f9d11076e9f2cb4b3393fd0ad12e3588a2013d96b6e45d8fc83"} Apr 22 19:39:54.318193 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:54.317712 2569 scope.go:117] "RemoveContainer" containerID="8381acf58a0c54dfeff445ee59cc090e4f9b8e49b72c85ffd981e0c6addadb82" Apr 22 19:39:54.329580 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:54.329551 2569 scope.go:117] "RemoveContainer" containerID="c21ea7090aefabee02e995f6cb0c18204bec2cdc5ac70ce2f63d123e51bc77a3" Apr 22 19:39:54.342498 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:54.342468 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf"] Apr 22 19:39:54.348381 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:54.348344 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7tjcf"] Apr 22 19:39:55.244970 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.244944 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:55.295228 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.295189 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-uds\") pod \"59b41f44-a87c-41f9-a5f8-bcff7c859162\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " Apr 22 19:39:55.295390 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.295245 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-kserve-provision-location\") pod \"59b41f44-a87c-41f9-a5f8-bcff7c859162\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " Apr 22 19:39:55.295390 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.295279 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-cache\") pod \"59b41f44-a87c-41f9-a5f8-bcff7c859162\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " Apr 22 19:39:55.295390 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.295309 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l95x9\" (UniqueName: \"kubernetes.io/projected/59b41f44-a87c-41f9-a5f8-bcff7c859162-kube-api-access-l95x9\") pod \"59b41f44-a87c-41f9-a5f8-bcff7c859162\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " Apr 22 19:39:55.295564 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.295394 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-tmp\") pod \"59b41f44-a87c-41f9-a5f8-bcff7c859162\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " Apr 22 19:39:55.295564 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.295431 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/59b41f44-a87c-41f9-a5f8-bcff7c859162-tls-certs\") pod \"59b41f44-a87c-41f9-a5f8-bcff7c859162\" (UID: \"59b41f44-a87c-41f9-a5f8-bcff7c859162\") " Apr 22 19:39:55.295564 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.295431 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "59b41f44-a87c-41f9-a5f8-bcff7c859162" (UID: "59b41f44-a87c-41f9-a5f8-bcff7c859162"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:55.295699 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.295679 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-uds\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:55.295699 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.295674 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "59b41f44-a87c-41f9-a5f8-bcff7c859162" (UID: "59b41f44-a87c-41f9-a5f8-bcff7c859162"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:55.295790 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.295717 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "59b41f44-a87c-41f9-a5f8-bcff7c859162" (UID: "59b41f44-a87c-41f9-a5f8-bcff7c859162"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:55.296023 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.295988 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "59b41f44-a87c-41f9-a5f8-bcff7c859162" (UID: "59b41f44-a87c-41f9-a5f8-bcff7c859162"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:55.297611 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.297585 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59b41f44-a87c-41f9-a5f8-bcff7c859162-kube-api-access-l95x9" (OuterVolumeSpecName: "kube-api-access-l95x9") pod "59b41f44-a87c-41f9-a5f8-bcff7c859162" (UID: "59b41f44-a87c-41f9-a5f8-bcff7c859162"). InnerVolumeSpecName "kube-api-access-l95x9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:39:55.297715 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.297654 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b41f44-a87c-41f9-a5f8-bcff7c859162-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "59b41f44-a87c-41f9-a5f8-bcff7c859162" (UID: "59b41f44-a87c-41f9-a5f8-bcff7c859162"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:39:55.324012 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.323983 2569 generic.go:358] "Generic (PLEG): container finished" podID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerID="a4af5a5ca61724e40b18851855e30cda3854f2e49b2407ed1ff20a2d6a952609" exitCode=0 Apr 22 19:39:55.324374 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.324053 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" Apr 22 19:39:55.324374 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.324059 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" event={"ID":"59b41f44-a87c-41f9-a5f8-bcff7c859162","Type":"ContainerDied","Data":"a4af5a5ca61724e40b18851855e30cda3854f2e49b2407ed1ff20a2d6a952609"} Apr 22 19:39:55.324374 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.324115 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2" event={"ID":"59b41f44-a87c-41f9-a5f8-bcff7c859162","Type":"ContainerDied","Data":"4dfa85bedea2ef0a1018057f36eb9a3de0999b196f6c6e22553a6bededf01926"} Apr 22 19:39:55.324374 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.324133 2569 scope.go:117] "RemoveContainer" containerID="a4af5a5ca61724e40b18851855e30cda3854f2e49b2407ed1ff20a2d6a952609" Apr 22 19:39:55.333248 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.333231 2569 scope.go:117] "RemoveContainer" containerID="dd93851a8c273319aebcfe6cda8caaf38cdb0d054d86c98afe07d8e107bcda03" Apr 22 19:39:55.340884 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.340866 2569 scope.go:117] "RemoveContainer" containerID="3dc4de612aec1b67676ed0f0418deeed44de1faf3587fd4c538f488464505752" Apr 22 19:39:55.347198 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.347014 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2"] Apr 22 19:39:55.350763 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.350743 2569 scope.go:117] "RemoveContainer" containerID="a4af5a5ca61724e40b18851855e30cda3854f2e49b2407ed1ff20a2d6a952609" Apr 22 19:39:55.350988 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.350963 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-68d5c5f7bppq2"] Apr 22 19:39:55.351079 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:39:55.351030 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4af5a5ca61724e40b18851855e30cda3854f2e49b2407ed1ff20a2d6a952609\": container with ID starting with a4af5a5ca61724e40b18851855e30cda3854f2e49b2407ed1ff20a2d6a952609 not found: ID does not exist" containerID="a4af5a5ca61724e40b18851855e30cda3854f2e49b2407ed1ff20a2d6a952609" Apr 22 19:39:55.351079 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.351053 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4af5a5ca61724e40b18851855e30cda3854f2e49b2407ed1ff20a2d6a952609"} err="failed to get container status \"a4af5a5ca61724e40b18851855e30cda3854f2e49b2407ed1ff20a2d6a952609\": rpc error: code = NotFound desc = could not find container \"a4af5a5ca61724e40b18851855e30cda3854f2e49b2407ed1ff20a2d6a952609\": container with ID starting with a4af5a5ca61724e40b18851855e30cda3854f2e49b2407ed1ff20a2d6a952609 not found: ID does not exist" Apr 22 19:39:55.351079 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.351071 2569 scope.go:117] "RemoveContainer" containerID="dd93851a8c273319aebcfe6cda8caaf38cdb0d054d86c98afe07d8e107bcda03" Apr 22 19:39:55.351359 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:39:55.351335 2569 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"dd93851a8c273319aebcfe6cda8caaf38cdb0d054d86c98afe07d8e107bcda03\": container with ID starting with dd93851a8c273319aebcfe6cda8caaf38cdb0d054d86c98afe07d8e107bcda03 not found: ID does not exist" containerID="dd93851a8c273319aebcfe6cda8caaf38cdb0d054d86c98afe07d8e107bcda03" Apr 22 19:39:55.351401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.351369 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd93851a8c273319aebcfe6cda8caaf38cdb0d054d86c98afe07d8e107bcda03"} err="failed to get container status \"dd93851a8c273319aebcfe6cda8caaf38cdb0d054d86c98afe07d8e107bcda03\": rpc error: code = NotFound desc = could not find container \"dd93851a8c273319aebcfe6cda8caaf38cdb0d054d86c98afe07d8e107bcda03\": container with ID starting with dd93851a8c273319aebcfe6cda8caaf38cdb0d054d86c98afe07d8e107bcda03 not found: ID does not exist" Apr 22 19:39:55.351401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.351394 2569 scope.go:117] "RemoveContainer" containerID="3dc4de612aec1b67676ed0f0418deeed44de1faf3587fd4c538f488464505752" Apr 22 19:39:55.351660 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:39:55.351644 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dc4de612aec1b67676ed0f0418deeed44de1faf3587fd4c538f488464505752\": container with ID starting with 3dc4de612aec1b67676ed0f0418deeed44de1faf3587fd4c538f488464505752 not found: ID does not exist" containerID="3dc4de612aec1b67676ed0f0418deeed44de1faf3587fd4c538f488464505752" Apr 22 19:39:55.351699 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.351664 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dc4de612aec1b67676ed0f0418deeed44de1faf3587fd4c538f488464505752"} err="failed to get container status \"3dc4de612aec1b67676ed0f0418deeed44de1faf3587fd4c538f488464505752\": rpc error: code = NotFound desc = could not find container \"3dc4de612aec1b67676ed0f0418deeed44de1faf3587fd4c538f488464505752\": container with ID starting with 3dc4de612aec1b67676ed0f0418deeed44de1faf3587fd4c538f488464505752 not found: ID does not exist" Apr 22 19:39:55.396373 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.396278 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-tmp\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:55.396373 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.396312 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/59b41f44-a87c-41f9-a5f8-bcff7c859162-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:55.396373 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.396322 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:55.396373 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.396331 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/59b41f44-a87c-41f9-a5f8-bcff7c859162-tokenizer-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:55.396373 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:39:55.396342 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l95x9\" (UniqueName: \"kubernetes.io/projected/59b41f44-a87c-41f9-a5f8-bcff7c859162-kube-api-access-l95x9\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:39:55.951667 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.951632 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59b41f44-a87c-41f9-a5f8-bcff7c859162" path="/var/lib/kubelet/pods/59b41f44-a87c-41f9-a5f8-bcff7c859162/volumes" Apr 22 19:39:55.952089 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:55.952076 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7af7a3d5-2b85-4347-b744-43599d332ee2" path="/var/lib/kubelet/pods/7af7a3d5-2b85-4347-b744-43599d332ee2/volumes" Apr 22 19:39:57.248893 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:39:57.248860 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:40:24.512742 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.512702 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd"] Apr 22 19:40:24.513238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.513042 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerName="tokenizer" Apr 22 19:40:24.513238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.513054 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerName="tokenizer" Apr 22 19:40:24.513238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.513068 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7af7a3d5-2b85-4347-b744-43599d332ee2" containerName="storage-initializer" Apr 22 19:40:24.513238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.513074 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af7a3d5-2b85-4347-b744-43599d332ee2" containerName="storage-initializer" Apr 22 19:40:24.513238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.513083 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerName="storage-initializer" Apr 22 19:40:24.513238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.513089 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerName="storage-initializer" Apr 22 19:40:24.513238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.513114 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerName="main" Apr 22 19:40:24.513238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.513120 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerName="main" Apr 22 19:40:24.513238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.513127 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7af7a3d5-2b85-4347-b744-43599d332ee2" containerName="main" Apr 22 19:40:24.513238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.513132 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af7a3d5-2b85-4347-b744-43599d332ee2" containerName="main" Apr 22 19:40:24.513238 ip-10-0-140-242 kubenswrapper[2569]: I0422 
19:40:24.513187 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7af7a3d5-2b85-4347-b744-43599d332ee2" containerName="main" Apr 22 19:40:24.513238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.513196 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerName="tokenizer" Apr 22 19:40:24.513238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.513203 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="59b41f44-a87c-41f9-a5f8-bcff7c859162" containerName="main" Apr 22 19:40:24.516412 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.516384 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.519265 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.519238 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 19:40:24.530529 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.530496 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd"] Apr 22 19:40:24.543854 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.543822 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2cg8\" (UniqueName: \"kubernetes.io/projected/a58087b7-5a6f-423f-abe4-895579020ee8-kube-api-access-l2cg8\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.544033 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.543866 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-kserve-provision-location\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.544033 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.543928 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-model-cache\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.544033 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.543973 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-dshm\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.544206 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.544062 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-home\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.544206 
ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.544088 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a58087b7-5a6f-423f-abe4-895579020ee8-tls-certs\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.645388 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.645348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-home\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.645578 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.645400 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a58087b7-5a6f-423f-abe4-895579020ee8-tls-certs\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.645578 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.645436 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2cg8\" (UniqueName: \"kubernetes.io/projected/a58087b7-5a6f-423f-abe4-895579020ee8-kube-api-access-l2cg8\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.645578 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.645460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-kserve-provision-location\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.645745 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.645597 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-model-cache\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.645745 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.645675 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-dshm\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.645871 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.645737 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-home\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.645929 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.645905 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-kserve-provision-location\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.645966 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.645922 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-model-cache\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.647958 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.647933 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-dshm\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.648147 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.648128 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a58087b7-5a6f-423f-abe4-895579020ee8-tls-certs\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.653577 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.653554 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2cg8\" (UniqueName: \"kubernetes.io/projected/a58087b7-5a6f-423f-abe4-895579020ee8-kube-api-access-l2cg8\") pod \"stop-feature-test-kserve-6688bd464-4pqkd\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.829601 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.829510 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:40:24.961850 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:24.961816 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd"] Apr 22 19:40:24.963688 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:40:24.963656 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda58087b7_5a6f_423f_abe4_895579020ee8.slice/crio-40c00edaf73fbda66ca844b8805b4d2706cd7a1efe11a0276537cf4b09ecbf46 WatchSource:0}: Error finding container 40c00edaf73fbda66ca844b8805b4d2706cd7a1efe11a0276537cf4b09ecbf46: Status 404 returned error can't find the container with id 40c00edaf73fbda66ca844b8805b4d2706cd7a1efe11a0276537cf4b09ecbf46 Apr 22 19:40:25.441898 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:25.441854 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" event={"ID":"a58087b7-5a6f-423f-abe4-895579020ee8","Type":"ContainerStarted","Data":"94017b0642b13d75ce3e96553d4217f81693f80de4e44d53b2be9e283abe0588"} Apr 22 19:40:25.442079 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:25.441907 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" event={"ID":"a58087b7-5a6f-423f-abe4-895579020ee8","Type":"ContainerStarted","Data":"40c00edaf73fbda66ca844b8805b4d2706cd7a1efe11a0276537cf4b09ecbf46"} Apr 22 19:40:30.466174 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:30.466138 2569 generic.go:358] "Generic (PLEG): container finished" podID="a58087b7-5a6f-423f-abe4-895579020ee8" containerID="94017b0642b13d75ce3e96553d4217f81693f80de4e44d53b2be9e283abe0588" exitCode=0 Apr 22 19:40:30.466670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:30.466195 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" event={"ID":"a58087b7-5a6f-423f-abe4-895579020ee8","Type":"ContainerDied","Data":"94017b0642b13d75ce3e96553d4217f81693f80de4e44d53b2be9e283abe0588"} Apr 22 19:40:58.598632 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:58.598593 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" event={"ID":"a58087b7-5a6f-423f-abe4-895579020ee8","Type":"ContainerStarted","Data":"8b5c47a0fa344ad1b7f75972e738c1ee031b40157d8cd6ac5ddd4c68118ba5db"} Apr 22 19:40:58.623801 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:40:58.623726 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" podStartSLOduration=7.562497502 podStartE2EDuration="34.623703608s" podCreationTimestamp="2026-04-22 19:40:24 +0000 UTC" firstStartedPulling="2026-04-22 19:40:30.467292833 +0000 UTC m=+1003.115307618" lastFinishedPulling="2026-04-22 19:40:57.528498942 +0000 UTC m=+1030.176513724" observedRunningTime="2026-04-22 19:40:58.620536588 +0000 UTC m=+1031.268551401" watchObservedRunningTime="2026-04-22 19:40:58.623703608 +0000 UTC m=+1031.271718419" Apr 22 19:41:04.829881 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:41:04.829830 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:41:04.829881 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:41:04.829899 2569 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:41:04.831548 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:41:04.831520 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 19:41:14.830123 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:41:14.830033 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 19:41:24.830525 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:41:24.830426 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 19:41:34.830577 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:41:34.830480 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 19:41:44.830078 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:41:44.830031 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 19:41:54.829978 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:41:54.829931 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 19:42:00.970143 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:00.970074 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll"] Apr 22 19:42:00.970746 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:00.970541 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" podUID="fd2ba92e-a328-4a62-b5e6-b88b26ade46d" containerName="main" containerID="cri-o://f13db1522c5a8ed6c83d6c6269e55e2f26160f205aee87072fd8035977dba694" gracePeriod=30 Apr 22 19:42:00.970746 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:00.970613 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" podUID="fd2ba92e-a328-4a62-b5e6-b88b26ade46d" containerName="tokenizer" containerID="cri-o://53a22bf96312256c75515289e9fd1ab4c5fd659fc3e04ad555f3d4814226fa7c" gracePeriod=30 
Apr 22 19:42:01.864340 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:01.864299 2569 generic.go:358] "Generic (PLEG): container finished" podID="fd2ba92e-a328-4a62-b5e6-b88b26ade46d" containerID="f13db1522c5a8ed6c83d6c6269e55e2f26160f205aee87072fd8035977dba694" exitCode=0 Apr 22 19:42:01.864562 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:01.864388 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" event={"ID":"fd2ba92e-a328-4a62-b5e6-b88b26ade46d","Type":"ContainerDied","Data":"f13db1522c5a8ed6c83d6c6269e55e2f26160f205aee87072fd8035977dba694"} Apr 22 19:42:02.326155 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.326125 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:42:02.446858 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.446827 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-cache\") pod \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " Apr 22 19:42:02.447042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.446883 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-tmp\") pod \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " Apr 22 19:42:02.447042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.446928 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tls-certs\") pod \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " Apr 22 19:42:02.447042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.446949 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-kserve-provision-location\") pod \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " Apr 22 19:42:02.447042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.446970 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-uds\") pod \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " Apr 22 19:42:02.447042 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.446996 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpxr6\" (UniqueName: \"kubernetes.io/projected/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-kube-api-access-lpxr6\") pod \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\" (UID: \"fd2ba92e-a328-4a62-b5e6-b88b26ade46d\") " Apr 22 19:42:02.447361 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.447169 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "fd2ba92e-a328-4a62-b5e6-b88b26ade46d" (UID: "fd2ba92e-a328-4a62-b5e6-b88b26ade46d"). 
InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:42:02.447419 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.447366 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "fd2ba92e-a328-4a62-b5e6-b88b26ade46d" (UID: "fd2ba92e-a328-4a62-b5e6-b88b26ade46d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:42:02.447499 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.447474 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "fd2ba92e-a328-4a62-b5e6-b88b26ade46d" (UID: "fd2ba92e-a328-4a62-b5e6-b88b26ade46d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:42:02.447984 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.447959 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fd2ba92e-a328-4a62-b5e6-b88b26ade46d" (UID: "fd2ba92e-a328-4a62-b5e6-b88b26ade46d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:42:02.449506 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.449481 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fd2ba92e-a328-4a62-b5e6-b88b26ade46d" (UID: "fd2ba92e-a328-4a62-b5e6-b88b26ade46d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:42:02.449506 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.449495 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-kube-api-access-lpxr6" (OuterVolumeSpecName: "kube-api-access-lpxr6") pod "fd2ba92e-a328-4a62-b5e6-b88b26ade46d" (UID: "fd2ba92e-a328-4a62-b5e6-b88b26ade46d"). InnerVolumeSpecName "kube-api-access-lpxr6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:42:02.548425 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.548376 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:42:02.548425 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.548421 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-tmp\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:42:02.548425 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.548433 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:42:02.548425 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.548443 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:42:02.548708 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.548453 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-tokenizer-uds\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:42:02.548708 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.548463 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lpxr6\" (UniqueName: \"kubernetes.io/projected/fd2ba92e-a328-4a62-b5e6-b88b26ade46d-kube-api-access-lpxr6\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:42:02.870726 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.870633 2569 generic.go:358] "Generic (PLEG): container finished" podID="fd2ba92e-a328-4a62-b5e6-b88b26ade46d" containerID="53a22bf96312256c75515289e9fd1ab4c5fd659fc3e04ad555f3d4814226fa7c" exitCode=0 Apr 22 19:42:02.870726 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.870715 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" Apr 22 19:42:02.870936 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.870758 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" event={"ID":"fd2ba92e-a328-4a62-b5e6-b88b26ade46d","Type":"ContainerDied","Data":"53a22bf96312256c75515289e9fd1ab4c5fd659fc3e04ad555f3d4814226fa7c"} Apr 22 19:42:02.870936 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.870789 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll" event={"ID":"fd2ba92e-a328-4a62-b5e6-b88b26ade46d","Type":"ContainerDied","Data":"9ec9014ac45d595460a542e4f25da023683ea1a93e2e007b6544e23d852b7258"} Apr 22 19:42:02.870936 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.870805 2569 scope.go:117] "RemoveContainer" containerID="53a22bf96312256c75515289e9fd1ab4c5fd659fc3e04ad555f3d4814226fa7c" Apr 22 19:42:02.880678 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.880655 2569 scope.go:117] "RemoveContainer" containerID="f13db1522c5a8ed6c83d6c6269e55e2f26160f205aee87072fd8035977dba694" Apr 22 19:42:02.891440 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.891413 2569 scope.go:117] "RemoveContainer" containerID="71fc2b59c11af40a473edb4f29ca50ec353263b39776f264fd71aa6ad24e186d" Apr 22 19:42:02.896471 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.896439 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll"] Apr 22 19:42:02.901537 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.901404 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schesffll"] Apr 22 19:42:02.901623 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.901599 2569 scope.go:117] "RemoveContainer" containerID="53a22bf96312256c75515289e9fd1ab4c5fd659fc3e04ad555f3d4814226fa7c" Apr 22 19:42:02.901966 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:42:02.901948 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a22bf96312256c75515289e9fd1ab4c5fd659fc3e04ad555f3d4814226fa7c\": container with ID starting with 53a22bf96312256c75515289e9fd1ab4c5fd659fc3e04ad555f3d4814226fa7c not found: ID does not exist" containerID="53a22bf96312256c75515289e9fd1ab4c5fd659fc3e04ad555f3d4814226fa7c" Apr 22 19:42:02.902025 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.901979 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a22bf96312256c75515289e9fd1ab4c5fd659fc3e04ad555f3d4814226fa7c"} err="failed to get container status \"53a22bf96312256c75515289e9fd1ab4c5fd659fc3e04ad555f3d4814226fa7c\": rpc error: code = NotFound desc = could not find container \"53a22bf96312256c75515289e9fd1ab4c5fd659fc3e04ad555f3d4814226fa7c\": container with ID starting with 53a22bf96312256c75515289e9fd1ab4c5fd659fc3e04ad555f3d4814226fa7c not found: ID does not exist" Apr 22 19:42:02.902025 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.901996 2569 scope.go:117] "RemoveContainer" containerID="f13db1522c5a8ed6c83d6c6269e55e2f26160f205aee87072fd8035977dba694" Apr 22 19:42:02.902303 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:42:02.902285 2569 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"f13db1522c5a8ed6c83d6c6269e55e2f26160f205aee87072fd8035977dba694\": container with ID starting with f13db1522c5a8ed6c83d6c6269e55e2f26160f205aee87072fd8035977dba694 not found: ID does not exist" containerID="f13db1522c5a8ed6c83d6c6269e55e2f26160f205aee87072fd8035977dba694" Apr 22 19:42:02.902365 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.902309 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f13db1522c5a8ed6c83d6c6269e55e2f26160f205aee87072fd8035977dba694"} err="failed to get container status \"f13db1522c5a8ed6c83d6c6269e55e2f26160f205aee87072fd8035977dba694\": rpc error: code = NotFound desc = could not find container \"f13db1522c5a8ed6c83d6c6269e55e2f26160f205aee87072fd8035977dba694\": container with ID starting with f13db1522c5a8ed6c83d6c6269e55e2f26160f205aee87072fd8035977dba694 not found: ID does not exist" Apr 22 19:42:02.902365 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.902325 2569 scope.go:117] "RemoveContainer" containerID="71fc2b59c11af40a473edb4f29ca50ec353263b39776f264fd71aa6ad24e186d" Apr 22 19:42:02.902544 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:42:02.902528 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71fc2b59c11af40a473edb4f29ca50ec353263b39776f264fd71aa6ad24e186d\": container with ID starting with 71fc2b59c11af40a473edb4f29ca50ec353263b39776f264fd71aa6ad24e186d not found: ID does not exist" containerID="71fc2b59c11af40a473edb4f29ca50ec353263b39776f264fd71aa6ad24e186d" Apr 22 19:42:02.902590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:02.902549 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fc2b59c11af40a473edb4f29ca50ec353263b39776f264fd71aa6ad24e186d"} err="failed to get container status \"71fc2b59c11af40a473edb4f29ca50ec353263b39776f264fd71aa6ad24e186d\": rpc error: code = NotFound desc = could not find container \"71fc2b59c11af40a473edb4f29ca50ec353263b39776f264fd71aa6ad24e186d\": container with ID starting with 71fc2b59c11af40a473edb4f29ca50ec353263b39776f264fd71aa6ad24e186d not found: ID does not exist" Apr 22 19:42:03.924477 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:03.924445 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs"] Apr 22 19:42:03.924881 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:03.924829 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd2ba92e-a328-4a62-b5e6-b88b26ade46d" containerName="tokenizer" Apr 22 19:42:03.924881 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:03.924842 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd2ba92e-a328-4a62-b5e6-b88b26ade46d" containerName="tokenizer" Apr 22 19:42:03.924881 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:03.924861 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd2ba92e-a328-4a62-b5e6-b88b26ade46d" containerName="storage-initializer" Apr 22 19:42:03.924881 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:03.924867 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd2ba92e-a328-4a62-b5e6-b88b26ade46d" containerName="storage-initializer" Apr 22 19:42:03.924881 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:03.924873 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="fd2ba92e-a328-4a62-b5e6-b88b26ade46d" containerName="main" Apr 22 19:42:03.924881 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:03.924878 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd2ba92e-a328-4a62-b5e6-b88b26ade46d" containerName="main" Apr 22 19:42:03.925068 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:03.924933 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd2ba92e-a328-4a62-b5e6-b88b26ade46d" containerName="main" Apr 22 19:42:03.925068 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:03.924945 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd2ba92e-a328-4a62-b5e6-b88b26ade46d" containerName="tokenizer" Apr 22 19:42:03.929750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:03.929725 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:03.932392 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:03.932368 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 22 19:42:03.939495 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:03.939466 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs"] Apr 22 19:42:03.953518 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:03.953482 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd2ba92e-a328-4a62-b5e6-b88b26ade46d" path="/var/lib/kubelet/pods/fd2ba92e-a328-4a62-b5e6-b88b26ade46d/volumes" Apr 22 19:42:04.062089 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.062052 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-model-cache\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.062286 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.062156 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-home\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.062286 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.062186 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.062286 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.062233 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f71ed3c-cc66-46d4-b42f-001993a466c5-tls-certs\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 
19:42:04.062286 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.062269 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-dshm\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.062434 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.062305 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r25v\" (UniqueName: \"kubernetes.io/projected/9f71ed3c-cc66-46d4-b42f-001993a466c5-kube-api-access-9r25v\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.124853 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.124813 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t"] Apr 22 19:42:04.128909 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.128879 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.131870 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.131823 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-xlsgx\"" Apr 22 19:42:04.139370 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.139337 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t"] Apr 22 19:42:04.163462 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.163423 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-home\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.163462 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.163463 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.163742 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.163488 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f71ed3c-cc66-46d4-b42f-001993a466c5-tls-certs\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.163742 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.163539 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-dshm\") pod 
\"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.163742 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.163586 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9r25v\" (UniqueName: \"kubernetes.io/projected/9f71ed3c-cc66-46d4-b42f-001993a466c5-kube-api-access-9r25v\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.163742 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.163638 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-model-cache\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.163958 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.163895 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-home\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.164019 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.163980 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.164166 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.164143 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-model-cache\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.166351 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.166323 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f71ed3c-cc66-46d4-b42f-001993a466c5-tls-certs\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.166492 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.166465 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-dshm\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.178298 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.178212 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r25v\" (UniqueName: 
\"kubernetes.io/projected/9f71ed3c-cc66-46d4-b42f-001993a466c5-kube-api-access-9r25v\") pod \"custom-route-timeout-test-kserve-746c5b55d8-jg9cs\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.241851 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.241813 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:04.265082 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.265049 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.265421 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.265155 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.265421 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.265205 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.265421 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.265231 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.265421 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.265255 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.265421 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.265290 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7zws\" (UniqueName: \"kubernetes.io/projected/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-kube-api-access-x7zws\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.366728 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.366685 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.366931 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.366772 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7zws\" (UniqueName: \"kubernetes.io/projected/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-kube-api-access-x7zws\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.366931 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.366898 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.367059 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.366932 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.367059 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.366973 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.367059 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.366998 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.367285 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.367213 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.367388 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:42:04.367360 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.367495 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.367472 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.367683 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.367665 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.370428 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.370388 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.380539 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.380511 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7zws\" (UniqueName: \"kubernetes.io/projected/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-kube-api-access-x7zws\") pod \"custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.391662 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.391634 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs"] Apr 22 19:42:04.394237 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:42:04.394195 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f71ed3c_cc66_46d4_b42f_001993a466c5.slice/crio-18319e5311719a2f2b1cd2e3af84fa8d9edbe2c2dd3411d53efccdad382773ae WatchSource:0}: Error finding container 18319e5311719a2f2b1cd2e3af84fa8d9edbe2c2dd3411d53efccdad382773ae: Status 404 returned error can't find the container with id 18319e5311719a2f2b1cd2e3af84fa8d9edbe2c2dd3411d53efccdad382773ae Apr 22 19:42:04.396142 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.396123 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:42:04.442142 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.442088 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:04.603569 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.603523 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t"] Apr 22 19:42:04.604304 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:42:04.604271 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ef53118_df8e_466f_b9b8_3c3d1b4abb3b.slice/crio-47878127e5cbd95886188fcea80fc81b9c4391f788cf3800655d55b7996927e5 WatchSource:0}: Error finding container 47878127e5cbd95886188fcea80fc81b9c4391f788cf3800655d55b7996927e5: Status 404 returned error can't find the container with id 47878127e5cbd95886188fcea80fc81b9c4391f788cf3800655d55b7996927e5 Apr 22 19:42:04.830604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.830537 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 19:42:04.882736 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.882698 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" event={"ID":"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b","Type":"ContainerStarted","Data":"8410ccd6946fb5f2a6c8a4ab66cf95a720ac46ca915a74068e0af67228552a31"} Apr 22 19:42:04.882964 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.882768 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" event={"ID":"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b","Type":"ContainerStarted","Data":"47878127e5cbd95886188fcea80fc81b9c4391f788cf3800655d55b7996927e5"} Apr 22 19:42:04.884485 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.884448 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" event={"ID":"9f71ed3c-cc66-46d4-b42f-001993a466c5","Type":"ContainerStarted","Data":"1660b66870489fef330b9927ceed3f39346b940f3415abf9c4caa0c5662390d0"} Apr 22 19:42:04.884635 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:04.884494 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" event={"ID":"9f71ed3c-cc66-46d4-b42f-001993a466c5","Type":"ContainerStarted","Data":"18319e5311719a2f2b1cd2e3af84fa8d9edbe2c2dd3411d53efccdad382773ae"} Apr 22 19:42:05.891873 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:05.891777 2569 generic.go:358] "Generic (PLEG): container finished" podID="2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" containerID="8410ccd6946fb5f2a6c8a4ab66cf95a720ac46ca915a74068e0af67228552a31" exitCode=0 Apr 22 19:42:05.892337 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:05.891884 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" event={"ID":"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b","Type":"ContainerDied","Data":"8410ccd6946fb5f2a6c8a4ab66cf95a720ac46ca915a74068e0af67228552a31"} Apr 22 19:42:06.898059 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:06.898025 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" event={"ID":"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b","Type":"ContainerStarted","Data":"91d298fdae33c462f680260bf4ece2535aab221a79c470842cc9481b6577c317"} Apr 22 19:42:06.898559 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:06.898065 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" event={"ID":"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b","Type":"ContainerStarted","Data":"b332d90f32c476fe5f0d78e1a4525264843f73dc5929cd545fb7240c2f67616a"} Apr 22 19:42:06.898559 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:06.898260 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:06.923172 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:06.923108 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" podStartSLOduration=2.923070997 podStartE2EDuration="2.923070997s" podCreationTimestamp="2026-04-22 19:42:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:42:06.920596392 +0000 UTC m=+1099.568611197" watchObservedRunningTime="2026-04-22 19:42:06.923070997 +0000 UTC m=+1099.571085800" Apr 22 19:42:09.912197 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:09.912162 2569 generic.go:358] "Generic (PLEG): container finished" podID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerID="1660b66870489fef330b9927ceed3f39346b940f3415abf9c4caa0c5662390d0" exitCode=0 Apr 22 19:42:09.912544 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:09.912220 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" event={"ID":"9f71ed3c-cc66-46d4-b42f-001993a466c5","Type":"ContainerDied","Data":"1660b66870489fef330b9927ceed3f39346b940f3415abf9c4caa0c5662390d0"} Apr 22 19:42:10.919745 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:10.919704 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" event={"ID":"9f71ed3c-cc66-46d4-b42f-001993a466c5","Type":"ContainerStarted","Data":"a690a5d67a58264098702b2a53bdab0fcb36db5a87b3b08fb6183bdceedd8316"} Apr 22 19:42:10.947048 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:10.946987 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" podStartSLOduration=7.946972111 podStartE2EDuration="7.946972111s" podCreationTimestamp="2026-04-22 19:42:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:42:10.94295799 +0000 UTC m=+1103.590972794" watchObservedRunningTime="2026-04-22 19:42:10.946972111 +0000 UTC m=+1103.594986915" Apr 22 19:42:14.242341 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:14.242298 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:14.242827 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:14.242377 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:42:14.243893 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:14.243861 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 19:42:14.443584 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:14.443545 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:14.445248 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:14.445215 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:14.447065 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:14.447039 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:14.830442 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:14.830394 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 19:42:14.938882 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:14.938840 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:24.242707 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:24.242665 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 19:42:24.830837 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:24.830738 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 19:42:34.242990 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:34.242941 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 19:42:34.839669 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:34.839636 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:42:34.847986 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:34.847959 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:42:35.797991 
ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:35.797952 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd"] Apr 22 19:42:36.042606 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:36.042565 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="main" containerID="cri-o://8b5c47a0fa344ad1b7f75972e738c1ee031b40157d8cd6ac5ddd4c68118ba5db" gracePeriod=30 Apr 22 19:42:36.948338 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:36.948303 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:42:44.242721 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:44.242674 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 19:42:54.243183 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:42:54.243138 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 19:43:04.243250 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:04.243148 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 19:43:06.421436 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.421408 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6688bd464-4pqkd_a58087b7-5a6f-423f-abe4-895579020ee8/main/0.log" Apr 22 19:43:06.421791 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.421774 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:43:06.555104 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.555068 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-dshm\") pod \"a58087b7-5a6f-423f-abe4-895579020ee8\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " Apr 22 19:43:06.555313 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.555174 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-model-cache\") pod \"a58087b7-5a6f-423f-abe4-895579020ee8\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " Apr 22 19:43:06.555313 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.555202 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-home\") pod \"a58087b7-5a6f-423f-abe4-895579020ee8\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " Apr 22 19:43:06.555313 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.555227 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-kserve-provision-location\") pod \"a58087b7-5a6f-423f-abe4-895579020ee8\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " Apr 22 19:43:06.555313 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.555258 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2cg8\" (UniqueName: \"kubernetes.io/projected/a58087b7-5a6f-423f-abe4-895579020ee8-kube-api-access-l2cg8\") pod \"a58087b7-5a6f-423f-abe4-895579020ee8\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " Apr 22 19:43:06.555313 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.555306 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a58087b7-5a6f-423f-abe4-895579020ee8-tls-certs\") pod \"a58087b7-5a6f-423f-abe4-895579020ee8\" (UID: \"a58087b7-5a6f-423f-abe4-895579020ee8\") " Apr 22 19:43:06.555588 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.555490 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-model-cache" (OuterVolumeSpecName: "model-cache") pod "a58087b7-5a6f-423f-abe4-895579020ee8" (UID: "a58087b7-5a6f-423f-abe4-895579020ee8"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:43:06.555588 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.555570 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-home" (OuterVolumeSpecName: "home") pod "a58087b7-5a6f-423f-abe4-895579020ee8" (UID: "a58087b7-5a6f-423f-abe4-895579020ee8"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:43:06.557604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.557572 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a58087b7-5a6f-423f-abe4-895579020ee8-kube-api-access-l2cg8" (OuterVolumeSpecName: "kube-api-access-l2cg8") pod "a58087b7-5a6f-423f-abe4-895579020ee8" (UID: "a58087b7-5a6f-423f-abe4-895579020ee8"). InnerVolumeSpecName "kube-api-access-l2cg8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:43:06.557756 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.557718 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-dshm" (OuterVolumeSpecName: "dshm") pod "a58087b7-5a6f-423f-abe4-895579020ee8" (UID: "a58087b7-5a6f-423f-abe4-895579020ee8"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:43:06.558044 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.558018 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a58087b7-5a6f-423f-abe4-895579020ee8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a58087b7-5a6f-423f-abe4-895579020ee8" (UID: "a58087b7-5a6f-423f-abe4-895579020ee8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:43:06.614782 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.614694 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a58087b7-5a6f-423f-abe4-895579020ee8" (UID: "a58087b7-5a6f-423f-abe4-895579020ee8"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:43:06.656944 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.656900 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-dshm\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:43:06.656944 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.656943 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-model-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:43:06.657179 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.656960 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-home\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:43:06.657179 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.656975 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a58087b7-5a6f-423f-abe4-895579020ee8-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:43:06.657179 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.656990 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l2cg8\" (UniqueName: \"kubernetes.io/projected/a58087b7-5a6f-423f-abe4-895579020ee8-kube-api-access-l2cg8\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:43:06.657179 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:06.657005 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a58087b7-5a6f-423f-abe4-895579020ee8-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:43:07.165431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:07.165403 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6688bd464-4pqkd_a58087b7-5a6f-423f-abe4-895579020ee8/main/0.log" Apr 22 19:43:07.165811 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:07.165785 2569 generic.go:358] "Generic (PLEG): container finished" podID="a58087b7-5a6f-423f-abe4-895579020ee8" containerID="8b5c47a0fa344ad1b7f75972e738c1ee031b40157d8cd6ac5ddd4c68118ba5db" exitCode=137 Apr 22 19:43:07.165888 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:07.165852 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" Apr 22 19:43:07.165888 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:07.165872 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" event={"ID":"a58087b7-5a6f-423f-abe4-895579020ee8","Type":"ContainerDied","Data":"8b5c47a0fa344ad1b7f75972e738c1ee031b40157d8cd6ac5ddd4c68118ba5db"} Apr 22 19:43:07.165956 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:07.165912 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd" event={"ID":"a58087b7-5a6f-423f-abe4-895579020ee8","Type":"ContainerDied","Data":"40c00edaf73fbda66ca844b8805b4d2706cd7a1efe11a0276537cf4b09ecbf46"} Apr 22 19:43:07.165956 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:07.165930 2569 scope.go:117] "RemoveContainer" containerID="8b5c47a0fa344ad1b7f75972e738c1ee031b40157d8cd6ac5ddd4c68118ba5db" Apr 22 19:43:07.189087 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:07.189053 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd"] Apr 22 19:43:07.194069 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:07.194043 2569 scope.go:117] "RemoveContainer" containerID="94017b0642b13d75ce3e96553d4217f81693f80de4e44d53b2be9e283abe0588" Apr 22 19:43:07.194371 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:07.194350 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-4pqkd"] Apr 22 19:43:07.257642 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:07.257617 2569 scope.go:117] "RemoveContainer" containerID="8b5c47a0fa344ad1b7f75972e738c1ee031b40157d8cd6ac5ddd4c68118ba5db" Apr 22 19:43:07.257960 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:43:07.257933 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b5c47a0fa344ad1b7f75972e738c1ee031b40157d8cd6ac5ddd4c68118ba5db\": container with ID starting with 8b5c47a0fa344ad1b7f75972e738c1ee031b40157d8cd6ac5ddd4c68118ba5db not found: ID does not exist" containerID="8b5c47a0fa344ad1b7f75972e738c1ee031b40157d8cd6ac5ddd4c68118ba5db" Apr 22 19:43:07.258062 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:07.257976 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5c47a0fa344ad1b7f75972e738c1ee031b40157d8cd6ac5ddd4c68118ba5db"} err="failed to get container status \"8b5c47a0fa344ad1b7f75972e738c1ee031b40157d8cd6ac5ddd4c68118ba5db\": rpc error: code = NotFound desc = could not find container \"8b5c47a0fa344ad1b7f75972e738c1ee031b40157d8cd6ac5ddd4c68118ba5db\": container with ID starting with 8b5c47a0fa344ad1b7f75972e738c1ee031b40157d8cd6ac5ddd4c68118ba5db not found: ID does not exist" Apr 22 19:43:07.258062 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:07.258002 2569 scope.go:117] "RemoveContainer" containerID="94017b0642b13d75ce3e96553d4217f81693f80de4e44d53b2be9e283abe0588" Apr 22 19:43:07.258343 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:43:07.258324 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94017b0642b13d75ce3e96553d4217f81693f80de4e44d53b2be9e283abe0588\": container with ID starting with 94017b0642b13d75ce3e96553d4217f81693f80de4e44d53b2be9e283abe0588 not found: ID does not exist" 
containerID="94017b0642b13d75ce3e96553d4217f81693f80de4e44d53b2be9e283abe0588" Apr 22 19:43:07.258394 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:07.258351 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94017b0642b13d75ce3e96553d4217f81693f80de4e44d53b2be9e283abe0588"} err="failed to get container status \"94017b0642b13d75ce3e96553d4217f81693f80de4e44d53b2be9e283abe0588\": rpc error: code = NotFound desc = could not find container \"94017b0642b13d75ce3e96553d4217f81693f80de4e44d53b2be9e283abe0588\": container with ID starting with 94017b0642b13d75ce3e96553d4217f81693f80de4e44d53b2be9e283abe0588 not found: ID does not exist" Apr 22 19:43:07.952006 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:07.951973 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" path="/var/lib/kubelet/pods/a58087b7-5a6f-423f-abe4-895579020ee8/volumes" Apr 22 19:43:14.242868 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:14.242822 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 19:43:24.243070 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:24.243022 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 19:43:34.242694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:34.242646 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 19:43:44.252977 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:44.252934 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:43:44.260839 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:44.260811 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:43:47.949878 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:47.949849 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:43:47.951460 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:47.951429 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:43:49.803912 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:49.803870 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs"] Apr 22 19:43:49.804329 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:49.804267 2569 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="main" containerID="cri-o://a690a5d67a58264098702b2a53bdab0fcb36db5a87b3b08fb6183bdceedd8316" gracePeriod=30 Apr 22 19:43:49.810920 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:49.810890 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t"] Apr 22 19:43:49.811350 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:49.811318 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" podUID="2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" containerName="main" containerID="cri-o://b332d90f32c476fe5f0d78e1a4525264843f73dc5929cd545fb7240c2f67616a" gracePeriod=30 Apr 22 19:43:49.811500 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:49.811325 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" podUID="2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" containerName="tokenizer" containerID="cri-o://91d298fdae33c462f680260bf4ece2535aab221a79c470842cc9481b6577c317" gracePeriod=30 Apr 22 19:43:50.346579 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:50.346540 2569 generic.go:358] "Generic (PLEG): container finished" podID="2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" containerID="b332d90f32c476fe5f0d78e1a4525264843f73dc5929cd545fb7240c2f67616a" exitCode=0 Apr 22 19:43:50.346770 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:50.346617 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" event={"ID":"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b","Type":"ContainerDied","Data":"b332d90f32c476fe5f0d78e1a4525264843f73dc5929cd545fb7240c2f67616a"} Apr 22 19:43:51.158940 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.158910 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:43:51.246536 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.246494 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-tmp\") pod \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " Apr 22 19:43:51.246735 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.246571 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tls-certs\") pod \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " Apr 22 19:43:51.246735 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.246617 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zws\" (UniqueName: \"kubernetes.io/projected/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-kube-api-access-x7zws\") pod \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " Apr 22 19:43:51.246735 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.246663 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-cache\") pod \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " Apr 22 19:43:51.246735 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.246696 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-uds\") pod \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " Apr 22 19:43:51.246932 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.246757 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-kserve-provision-location\") pod \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\" (UID: \"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b\") " Apr 22 19:43:51.246932 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.246896 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" (UID: "2ef53118-df8e-466f-b9b8-3c3d1b4abb3b"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:43:51.247046 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.247022 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-tmp\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:43:51.247140 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.247034 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" (UID: "2ef53118-df8e-466f-b9b8-3c3d1b4abb3b"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:43:51.247140 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.247047 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" (UID: "2ef53118-df8e-466f-b9b8-3c3d1b4abb3b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:43:51.247505 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.247480 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" (UID: "2ef53118-df8e-466f-b9b8-3c3d1b4abb3b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:43:51.248928 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.248906 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" (UID: "2ef53118-df8e-466f-b9b8-3c3d1b4abb3b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:43:51.249029 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.248924 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-kube-api-access-x7zws" (OuterVolumeSpecName: "kube-api-access-x7zws") pod "2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" (UID: "2ef53118-df8e-466f-b9b8-3c3d1b4abb3b"). InnerVolumeSpecName "kube-api-access-x7zws". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:43:51.347854 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.347820 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-uds\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:43:51.347854 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.347855 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:43:51.348060 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.347870 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:43:51.348060 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.347879 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x7zws\" (UniqueName: \"kubernetes.io/projected/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-kube-api-access-x7zws\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:43:51.348060 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.347889 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b-tokenizer-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:43:51.352791 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.352755 2569 generic.go:358] "Generic (PLEG): container finished" podID="2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" containerID="91d298fdae33c462f680260bf4ece2535aab221a79c470842cc9481b6577c317" exitCode=0 Apr 22 19:43:51.352966 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.352850 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" Apr 22 19:43:51.352966 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.352844 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" event={"ID":"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b","Type":"ContainerDied","Data":"91d298fdae33c462f680260bf4ece2535aab221a79c470842cc9481b6577c317"} Apr 22 19:43:51.352966 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.352900 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t" event={"ID":"2ef53118-df8e-466f-b9b8-3c3d1b4abb3b","Type":"ContainerDied","Data":"47878127e5cbd95886188fcea80fc81b9c4391f788cf3800655d55b7996927e5"} Apr 22 19:43:51.352966 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.352923 2569 scope.go:117] "RemoveContainer" containerID="91d298fdae33c462f680260bf4ece2535aab221a79c470842cc9481b6577c317" Apr 22 19:43:51.362562 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.362537 2569 scope.go:117] "RemoveContainer" containerID="b332d90f32c476fe5f0d78e1a4525264843f73dc5929cd545fb7240c2f67616a" Apr 22 19:43:51.370463 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.370440 2569 scope.go:117] "RemoveContainer" containerID="8410ccd6946fb5f2a6c8a4ab66cf95a720ac46ca915a74068e0af67228552a31" Apr 22 19:43:51.375260 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.375232 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t"] Apr 22 19:43:51.378566 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.378531 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-95867fb57fz7t"] Apr 22 19:43:51.380238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.380218 2569 scope.go:117] "RemoveContainer" containerID="91d298fdae33c462f680260bf4ece2535aab221a79c470842cc9481b6577c317" Apr 22 19:43:51.380509 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:43:51.380489 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d298fdae33c462f680260bf4ece2535aab221a79c470842cc9481b6577c317\": container with ID starting with 91d298fdae33c462f680260bf4ece2535aab221a79c470842cc9481b6577c317 not found: ID does not exist" containerID="91d298fdae33c462f680260bf4ece2535aab221a79c470842cc9481b6577c317" Apr 22 19:43:51.380567 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.380520 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d298fdae33c462f680260bf4ece2535aab221a79c470842cc9481b6577c317"} err="failed to get container status \"91d298fdae33c462f680260bf4ece2535aab221a79c470842cc9481b6577c317\": rpc error: code = NotFound desc = could not find container \"91d298fdae33c462f680260bf4ece2535aab221a79c470842cc9481b6577c317\": container with ID starting with 91d298fdae33c462f680260bf4ece2535aab221a79c470842cc9481b6577c317 not found: ID does not exist" Apr 22 19:43:51.380567 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.380541 2569 scope.go:117] "RemoveContainer" containerID="b332d90f32c476fe5f0d78e1a4525264843f73dc5929cd545fb7240c2f67616a" Apr 22 19:43:51.380788 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:43:51.380773 2569 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"b332d90f32c476fe5f0d78e1a4525264843f73dc5929cd545fb7240c2f67616a\": container with ID starting with b332d90f32c476fe5f0d78e1a4525264843f73dc5929cd545fb7240c2f67616a not found: ID does not exist" containerID="b332d90f32c476fe5f0d78e1a4525264843f73dc5929cd545fb7240c2f67616a" Apr 22 19:43:51.380834 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.380793 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b332d90f32c476fe5f0d78e1a4525264843f73dc5929cd545fb7240c2f67616a"} err="failed to get container status \"b332d90f32c476fe5f0d78e1a4525264843f73dc5929cd545fb7240c2f67616a\": rpc error: code = NotFound desc = could not find container \"b332d90f32c476fe5f0d78e1a4525264843f73dc5929cd545fb7240c2f67616a\": container with ID starting with b332d90f32c476fe5f0d78e1a4525264843f73dc5929cd545fb7240c2f67616a not found: ID does not exist" Apr 22 19:43:51.380834 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.380813 2569 scope.go:117] "RemoveContainer" containerID="8410ccd6946fb5f2a6c8a4ab66cf95a720ac46ca915a74068e0af67228552a31" Apr 22 19:43:51.381027 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:43:51.381011 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8410ccd6946fb5f2a6c8a4ab66cf95a720ac46ca915a74068e0af67228552a31\": container with ID starting with 8410ccd6946fb5f2a6c8a4ab66cf95a720ac46ca915a74068e0af67228552a31 not found: ID does not exist" containerID="8410ccd6946fb5f2a6c8a4ab66cf95a720ac46ca915a74068e0af67228552a31" Apr 22 19:43:51.381074 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.381033 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8410ccd6946fb5f2a6c8a4ab66cf95a720ac46ca915a74068e0af67228552a31"} err="failed to get container status \"8410ccd6946fb5f2a6c8a4ab66cf95a720ac46ca915a74068e0af67228552a31\": rpc error: code = NotFound desc = could not find container \"8410ccd6946fb5f2a6c8a4ab66cf95a720ac46ca915a74068e0af67228552a31\": container with ID starting with 8410ccd6946fb5f2a6c8a4ab66cf95a720ac46ca915a74068e0af67228552a31 not found: ID does not exist" Apr 22 19:43:51.953299 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:43:51.953256 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" path="/var/lib/kubelet/pods/2ef53118-df8e-466f-b9b8-3c3d1b4abb3b/volumes" Apr 22 19:44:09.921207 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.921171 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7"] Apr 22 19:44:09.921750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.921537 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" containerName="tokenizer" Apr 22 19:44:09.921750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.921548 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" containerName="tokenizer" Apr 22 19:44:09.921750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.921557 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" containerName="main" Apr 22 19:44:09.921750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.921564 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" containerName="main" Apr 22 19:44:09.921750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.921572 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="storage-initializer" Apr 22 19:44:09.921750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.921579 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="storage-initializer" Apr 22 19:44:09.921750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.921598 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="main" Apr 22 19:44:09.921750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.921603 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="main" Apr 22 19:44:09.921750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.921613 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" containerName="storage-initializer" Apr 22 19:44:09.921750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.921618 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" containerName="storage-initializer" Apr 22 19:44:09.921750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.921671 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a58087b7-5a6f-423f-abe4-895579020ee8" containerName="main" Apr 22 19:44:09.921750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.921679 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" containerName="main" Apr 22 19:44:09.921750 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.921687 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ef53118-df8e-466f-b9b8-3c3d1b4abb3b" containerName="tokenizer" Apr 22 19:44:09.924945 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.924925 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:09.928276 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.928256 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 22 19:44:09.933656 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:09.933631 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7"] Apr 22 19:44:10.018709 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.018667 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-home\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.018892 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.018716 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm5t2\" (UniqueName: \"kubernetes.io/projected/a009586b-bd98-485d-9aea-b6e5a86349fd-kube-api-access-mm5t2\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.018892 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.018777 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a009586b-bd98-485d-9aea-b6e5a86349fd-tls-certs\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.018892 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.018802 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-kserve-provision-location\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.018892 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.018862 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-dshm\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.019035 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.018911 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-model-cache\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.120226 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.120187 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-dshm\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.120226 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.120235 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-model-cache\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.120470 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.120272 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-home\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.120470 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.120301 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm5t2\" (UniqueName: \"kubernetes.io/projected/a009586b-bd98-485d-9aea-b6e5a86349fd-kube-api-access-mm5t2\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.120470 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.120354 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a009586b-bd98-485d-9aea-b6e5a86349fd-tls-certs\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.120470 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.120392 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-kserve-provision-location\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.120760 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.120735 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-home\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.120858 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.120760 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-model-cache\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.120858 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.120780 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-kserve-provision-location\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.122706 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.122681 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-dshm\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.123166 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.123144 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a009586b-bd98-485d-9aea-b6e5a86349fd-tls-certs\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.128285 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.128258 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm5t2\" (UniqueName: \"kubernetes.io/projected/a009586b-bd98-485d-9aea-b6e5a86349fd-kube-api-access-mm5t2\") pod \"router-with-refs-test-kserve-8db94848-d6qh7\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.236542 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.236496 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:10.375706 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.375678 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7"] Apr 22 19:44:10.377260 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:44:10.377232 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda009586b_bd98_485d_9aea_b6e5a86349fd.slice/crio-6e499aaeadd00178568b4cf24bdd4c08f734b61178143371c4a6b814fe5e5ab4 WatchSource:0}: Error finding container 6e499aaeadd00178568b4cf24bdd4c08f734b61178143371c4a6b814fe5e5ab4: Status 404 returned error can't find the container with id 6e499aaeadd00178568b4cf24bdd4c08f734b61178143371c4a6b814fe5e5ab4 Apr 22 19:44:10.427164 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.427129 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" event={"ID":"a009586b-bd98-485d-9aea-b6e5a86349fd","Type":"ContainerStarted","Data":"6e499aaeadd00178568b4cf24bdd4c08f734b61178143371c4a6b814fe5e5ab4"} Apr 22 19:44:10.595993 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.595903 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc"] Apr 22 19:44:10.600688 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.600659 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.604742 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.604718 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-hpjpx\"" Apr 22 19:44:10.629482 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.629449 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc"] Apr 22 19:44:10.725402 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.725358 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.725600 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.725421 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.725600 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.725498 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.725600 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.725553 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-655h5\" (UniqueName: \"kubernetes.io/projected/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-kube-api-access-655h5\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.725733 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.725601 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.725733 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.725645 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.826858 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.826819 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.827051 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.827008 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-655h5\" (UniqueName: \"kubernetes.io/projected/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-kube-api-access-655h5\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.827148 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.827062 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.827148 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.827133 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.827271 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.827185 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.827271 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.827205 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.827271 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.827219 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.827466 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:44:10.827436 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.827529 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.827472 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.827589 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.827545 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.829840 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.829816 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.835484 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.835459 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-655h5\" (UniqueName: \"kubernetes.io/projected/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-kube-api-access-655h5\") pod \"router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:10.909910 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:10.909828 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:11.058431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:11.058399 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc"] Apr 22 19:44:11.060849 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:44:11.060811 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabb41ec7_8919_4a86_a9f4_90b7f4f6b805.slice/crio-0f871e76b5b0bfd8c779a33625e863ccad78a19f92e85de1cabcc46548cb9534 WatchSource:0}: Error finding container 0f871e76b5b0bfd8c779a33625e863ccad78a19f92e85de1cabcc46548cb9534: Status 404 returned error can't find the container with id 0f871e76b5b0bfd8c779a33625e863ccad78a19f92e85de1cabcc46548cb9534 Apr 22 19:44:11.433023 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:11.432979 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" event={"ID":"abb41ec7-8919-4a86-a9f4-90b7f4f6b805","Type":"ContainerStarted","Data":"fc36a1f3ad81aa4d021871f0abd01100f1b4dcea85af5fe6aaf93477db8565f7"} Apr 22 19:44:11.433023 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:11.433021 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" event={"ID":"abb41ec7-8919-4a86-a9f4-90b7f4f6b805","Type":"ContainerStarted","Data":"0f871e76b5b0bfd8c779a33625e863ccad78a19f92e85de1cabcc46548cb9534"} Apr 22 19:44:11.434549 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:11.434519 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" event={"ID":"a009586b-bd98-485d-9aea-b6e5a86349fd","Type":"ContainerStarted","Data":"d015e72d586bcbc69954383684b0c745987313ed11e2a23381f0833209557ae1"} Apr 22 19:44:12.440074 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:12.440030 2569 generic.go:358] "Generic (PLEG): container finished" podID="abb41ec7-8919-4a86-a9f4-90b7f4f6b805" containerID="fc36a1f3ad81aa4d021871f0abd01100f1b4dcea85af5fe6aaf93477db8565f7" exitCode=0 Apr 22 19:44:12.440506 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:12.440138 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" event={"ID":"abb41ec7-8919-4a86-a9f4-90b7f4f6b805","Type":"ContainerDied","Data":"fc36a1f3ad81aa4d021871f0abd01100f1b4dcea85af5fe6aaf93477db8565f7"} Apr 22 19:44:13.446756 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:13.446718 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" event={"ID":"abb41ec7-8919-4a86-a9f4-90b7f4f6b805","Type":"ContainerStarted","Data":"c67d742413ffbdd000b5173af11e7069591ecebe4100f67ce09693146b2aeb78"} Apr 22 19:44:13.447202 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:13.446762 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" event={"ID":"abb41ec7-8919-4a86-a9f4-90b7f4f6b805","Type":"ContainerStarted","Data":"fd7dbbbcf9d843aac188ee320d50f402c13832b2cd25d09392b442e7764b7986"} Apr 22 19:44:13.447202 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:13.446865 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:13.474443 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:13.474364 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" podStartSLOduration=3.474286168 podStartE2EDuration="3.474286168s" podCreationTimestamp="2026-04-22 19:44:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:44:13.469045036 +0000 UTC m=+1226.117059840" watchObservedRunningTime="2026-04-22 19:44:13.474286168 +0000 UTC m=+1226.122300973" Apr 22 19:44:15.456956 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:15.456918 2569 generic.go:358] "Generic (PLEG): container finished" podID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerID="d015e72d586bcbc69954383684b0c745987313ed11e2a23381f0833209557ae1" exitCode=0 Apr 22 19:44:15.457585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:15.456975 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" event={"ID":"a009586b-bd98-485d-9aea-b6e5a86349fd","Type":"ContainerDied","Data":"d015e72d586bcbc69954383684b0c745987313ed11e2a23381f0833209557ae1"} Apr 22 19:44:16.462941 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:16.462895 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" event={"ID":"a009586b-bd98-485d-9aea-b6e5a86349fd","Type":"ContainerStarted","Data":"a9c038050ffca19b34717cc279dbdcb3fbaba35bb9563332a9c0e0de99168029"} Apr 22 19:44:16.487666 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:16.487598 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" podStartSLOduration=7.487578879 podStartE2EDuration="7.487578879s" podCreationTimestamp="2026-04-22 19:44:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:44:16.483551568 +0000 UTC m=+1229.131566372" watchObservedRunningTime="2026-04-22 19:44:16.487578879 +0000 UTC m=+1229.135593686" Apr 22 19:44:20.066605 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.066526 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-746c5b55d8-jg9cs_9f71ed3c-cc66-46d4-b42f-001993a466c5/main/0.log" Apr 22 19:44:20.067025 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.066884 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:44:20.212477 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.212430 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-home\") pod \"9f71ed3c-cc66-46d4-b42f-001993a466c5\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " Apr 22 19:44:20.212652 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.212495 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r25v\" (UniqueName: \"kubernetes.io/projected/9f71ed3c-cc66-46d4-b42f-001993a466c5-kube-api-access-9r25v\") pod \"9f71ed3c-cc66-46d4-b42f-001993a466c5\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " Apr 22 19:44:20.212652 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.212548 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f71ed3c-cc66-46d4-b42f-001993a466c5-tls-certs\") pod \"9f71ed3c-cc66-46d4-b42f-001993a466c5\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " Apr 22 19:44:20.212779 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.212657 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-model-cache\") pod \"9f71ed3c-cc66-46d4-b42f-001993a466c5\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " Apr 22 19:44:20.212779 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.212735 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-dshm\") pod \"9f71ed3c-cc66-46d4-b42f-001993a466c5\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " Apr 22 19:44:20.212870 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.212778 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-kserve-provision-location\") pod \"9f71ed3c-cc66-46d4-b42f-001993a466c5\" (UID: \"9f71ed3c-cc66-46d4-b42f-001993a466c5\") " Apr 22 19:44:20.212870 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.212804 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-home" (OuterVolumeSpecName: "home") pod "9f71ed3c-cc66-46d4-b42f-001993a466c5" (UID: "9f71ed3c-cc66-46d4-b42f-001993a466c5"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:44:20.213016 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.212992 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-model-cache" (OuterVolumeSpecName: "model-cache") pod "9f71ed3c-cc66-46d4-b42f-001993a466c5" (UID: "9f71ed3c-cc66-46d4-b42f-001993a466c5"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:44:20.213238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.213175 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-home\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:44:20.213238 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.213202 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-model-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:44:20.214954 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.214927 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f71ed3c-cc66-46d4-b42f-001993a466c5-kube-api-access-9r25v" (OuterVolumeSpecName: "kube-api-access-9r25v") pod "9f71ed3c-cc66-46d4-b42f-001993a466c5" (UID: "9f71ed3c-cc66-46d4-b42f-001993a466c5"). InnerVolumeSpecName "kube-api-access-9r25v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:44:20.215363 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.215335 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f71ed3c-cc66-46d4-b42f-001993a466c5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9f71ed3c-cc66-46d4-b42f-001993a466c5" (UID: "9f71ed3c-cc66-46d4-b42f-001993a466c5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:44:20.215875 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.215846 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-dshm" (OuterVolumeSpecName: "dshm") pod "9f71ed3c-cc66-46d4-b42f-001993a466c5" (UID: "9f71ed3c-cc66-46d4-b42f-001993a466c5"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:44:20.237192 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.237148 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:20.237192 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.237186 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:44:20.239158 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.239117 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 22 19:44:20.270019 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.269958 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9f71ed3c-cc66-46d4-b42f-001993a466c5" (UID: "9f71ed3c-cc66-46d4-b42f-001993a466c5"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:44:20.314501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.314457 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-dshm\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:44:20.314501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.314490 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f71ed3c-cc66-46d4-b42f-001993a466c5-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:44:20.314501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.314502 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9r25v\" (UniqueName: \"kubernetes.io/projected/9f71ed3c-cc66-46d4-b42f-001993a466c5-kube-api-access-9r25v\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:44:20.314501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.314511 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f71ed3c-cc66-46d4-b42f-001993a466c5-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:44:20.481583 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.481557 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-746c5b55d8-jg9cs_9f71ed3c-cc66-46d4-b42f-001993a466c5/main/0.log" Apr 22 19:44:20.481885 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.481858 2569 generic.go:358] "Generic (PLEG): container finished" podID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerID="a690a5d67a58264098702b2a53bdab0fcb36db5a87b3b08fb6183bdceedd8316" exitCode=137 Apr 22 19:44:20.482024 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.481906 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" event={"ID":"9f71ed3c-cc66-46d4-b42f-001993a466c5","Type":"ContainerDied","Data":"a690a5d67a58264098702b2a53bdab0fcb36db5a87b3b08fb6183bdceedd8316"} Apr 22 19:44:20.482024 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.481931 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" Apr 22 19:44:20.482024 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.481953 2569 scope.go:117] "RemoveContainer" containerID="a690a5d67a58264098702b2a53bdab0fcb36db5a87b3b08fb6183bdceedd8316" Apr 22 19:44:20.482220 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.481941 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs" event={"ID":"9f71ed3c-cc66-46d4-b42f-001993a466c5","Type":"ContainerDied","Data":"18319e5311719a2f2b1cd2e3af84fa8d9edbe2c2dd3411d53efccdad382773ae"} Apr 22 19:44:20.507617 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.507589 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs"] Apr 22 19:44:20.511235 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.511206 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-746c5b55d8-jg9cs"] Apr 22 19:44:20.512606 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.512591 2569 scope.go:117] "RemoveContainer" containerID="1660b66870489fef330b9927ceed3f39346b940f3415abf9c4caa0c5662390d0" Apr 22 19:44:20.581575 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.581547 2569 scope.go:117] "RemoveContainer" containerID="a690a5d67a58264098702b2a53bdab0fcb36db5a87b3b08fb6183bdceedd8316" Apr 22 19:44:20.581938 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:44:20.581914 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a690a5d67a58264098702b2a53bdab0fcb36db5a87b3b08fb6183bdceedd8316\": container with ID starting with a690a5d67a58264098702b2a53bdab0fcb36db5a87b3b08fb6183bdceedd8316 not found: ID does not exist" containerID="a690a5d67a58264098702b2a53bdab0fcb36db5a87b3b08fb6183bdceedd8316" Apr 22 19:44:20.581999 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.581948 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a690a5d67a58264098702b2a53bdab0fcb36db5a87b3b08fb6183bdceedd8316"} err="failed to get container status \"a690a5d67a58264098702b2a53bdab0fcb36db5a87b3b08fb6183bdceedd8316\": rpc error: code = NotFound desc = could not find container \"a690a5d67a58264098702b2a53bdab0fcb36db5a87b3b08fb6183bdceedd8316\": container with ID starting with a690a5d67a58264098702b2a53bdab0fcb36db5a87b3b08fb6183bdceedd8316 not found: ID does not exist" Apr 22 19:44:20.581999 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.581967 2569 scope.go:117] "RemoveContainer" containerID="1660b66870489fef330b9927ceed3f39346b940f3415abf9c4caa0c5662390d0" Apr 22 19:44:20.582275 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:44:20.582252 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1660b66870489fef330b9927ceed3f39346b940f3415abf9c4caa0c5662390d0\": container with ID starting with 1660b66870489fef330b9927ceed3f39346b940f3415abf9c4caa0c5662390d0 not found: ID does not exist" containerID="1660b66870489fef330b9927ceed3f39346b940f3415abf9c4caa0c5662390d0" Apr 22 19:44:20.582361 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.582284 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1660b66870489fef330b9927ceed3f39346b940f3415abf9c4caa0c5662390d0"} err="failed to get container status 
\"1660b66870489fef330b9927ceed3f39346b940f3415abf9c4caa0c5662390d0\": rpc error: code = NotFound desc = could not find container \"1660b66870489fef330b9927ceed3f39346b940f3415abf9c4caa0c5662390d0\": container with ID starting with 1660b66870489fef330b9927ceed3f39346b940f3415abf9c4caa0c5662390d0 not found: ID does not exist" Apr 22 19:44:20.909975 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.909884 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:20.909975 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.909932 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:20.912965 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:20.912936 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:21.489060 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:21.489025 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:21.952674 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:21.952642 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" path="/var/lib/kubelet/pods/9f71ed3c-cc66-46d4-b42f-001993a466c5/volumes" Apr 22 19:44:30.237298 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:30.237256 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 22 19:44:40.237683 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:40.237588 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 22 19:44:42.495182 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:42.495145 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:44:50.237630 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:44:50.237582 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 22 19:45:00.237836 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:00.237791 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 22 19:45:10.237986 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:10.237934 2569 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 22 19:45:20.237859 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:20.237805 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 22 19:45:30.237474 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:30.237424 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 22 19:45:40.237950 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:40.237901 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 22 19:45:50.247196 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:50.247163 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:45:50.255179 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:50.255156 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:45:56.013631 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:56.013590 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7"] Apr 22 19:45:56.014176 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:56.013876 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="main" containerID="cri-o://a9c038050ffca19b34717cc279dbdcb3fbaba35bb9563332a9c0e0de99168029" gracePeriod=30 Apr 22 19:45:56.022745 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:56.022708 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc"] Apr 22 19:45:56.023162 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:56.023131 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" podUID="abb41ec7-8919-4a86-a9f4-90b7f4f6b805" containerName="tokenizer" containerID="cri-o://c67d742413ffbdd000b5173af11e7069591ecebe4100f67ce09693146b2aeb78" gracePeriod=30 Apr 22 19:45:56.023258 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:56.023131 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" podUID="abb41ec7-8919-4a86-a9f4-90b7f4f6b805" containerName="main" 
containerID="cri-o://fd7dbbbcf9d843aac188ee320d50f402c13832b2cd25d09392b442e7764b7986" gracePeriod=30 Apr 22 19:45:56.876662 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:56.876625 2569 generic.go:358] "Generic (PLEG): container finished" podID="abb41ec7-8919-4a86-a9f4-90b7f4f6b805" containerID="fd7dbbbcf9d843aac188ee320d50f402c13832b2cd25d09392b442e7764b7986" exitCode=0 Apr 22 19:45:56.876662 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:56.876672 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" event={"ID":"abb41ec7-8919-4a86-a9f4-90b7f4f6b805","Type":"ContainerDied","Data":"fd7dbbbcf9d843aac188ee320d50f402c13832b2cd25d09392b442e7764b7986"} Apr 22 19:45:57.387985 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.387960 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:45:57.472357 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.472317 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-uds\") pod \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " Apr 22 19:45:57.472554 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.472379 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-655h5\" (UniqueName: \"kubernetes.io/projected/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-kube-api-access-655h5\") pod \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " Apr 22 19:45:57.472554 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.472411 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-cache\") pod \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " Apr 22 19:45:57.472554 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.472432 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-tmp\") pod \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " Apr 22 19:45:57.472554 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.472449 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-kserve-provision-location\") pod \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " Apr 22 19:45:57.472554 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.472470 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tls-certs\") pod \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\" (UID: \"abb41ec7-8919-4a86-a9f4-90b7f4f6b805\") " Apr 22 19:45:57.472773 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.472680 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod 
"abb41ec7-8919-4a86-a9f4-90b7f4f6b805" (UID: "abb41ec7-8919-4a86-a9f4-90b7f4f6b805"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:45:57.472773 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.472699 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "abb41ec7-8919-4a86-a9f4-90b7f4f6b805" (UID: "abb41ec7-8919-4a86-a9f4-90b7f4f6b805"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:45:57.472921 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.472899 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "abb41ec7-8919-4a86-a9f4-90b7f4f6b805" (UID: "abb41ec7-8919-4a86-a9f4-90b7f4f6b805"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:45:57.473302 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.473242 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "abb41ec7-8919-4a86-a9f4-90b7f4f6b805" (UID: "abb41ec7-8919-4a86-a9f4-90b7f4f6b805"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:45:57.474727 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.474706 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "abb41ec7-8919-4a86-a9f4-90b7f4f6b805" (UID: "abb41ec7-8919-4a86-a9f4-90b7f4f6b805"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:45:57.474809 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.474769 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-kube-api-access-655h5" (OuterVolumeSpecName: "kube-api-access-655h5") pod "abb41ec7-8919-4a86-a9f4-90b7f4f6b805" (UID: "abb41ec7-8919-4a86-a9f4-90b7f4f6b805"). InnerVolumeSpecName "kube-api-access-655h5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:45:57.573787 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.573744 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-uds\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:45:57.573787 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.573780 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-655h5\" (UniqueName: \"kubernetes.io/projected/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-kube-api-access-655h5\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:45:57.573787 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.573793 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:45:57.574036 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.573806 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tokenizer-tmp\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:45:57.574036 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.573818 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:45:57.574036 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.573832 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/abb41ec7-8919-4a86-a9f4-90b7f4f6b805-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:45:57.883220 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.883131 2569 generic.go:358] "Generic (PLEG): container finished" podID="abb41ec7-8919-4a86-a9f4-90b7f4f6b805" containerID="c67d742413ffbdd000b5173af11e7069591ecebe4100f67ce09693146b2aeb78" exitCode=0 Apr 22 19:45:57.883220 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.883174 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" event={"ID":"abb41ec7-8919-4a86-a9f4-90b7f4f6b805","Type":"ContainerDied","Data":"c67d742413ffbdd000b5173af11e7069591ecebe4100f67ce09693146b2aeb78"} Apr 22 19:45:57.883220 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.883201 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" event={"ID":"abb41ec7-8919-4a86-a9f4-90b7f4f6b805","Type":"ContainerDied","Data":"0f871e76b5b0bfd8c779a33625e863ccad78a19f92e85de1cabcc46548cb9534"} Apr 22 19:45:57.883220 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.883213 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc" Apr 22 19:45:57.883474 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.883217 2569 scope.go:117] "RemoveContainer" containerID="c67d742413ffbdd000b5173af11e7069591ecebe4100f67ce09693146b2aeb78" Apr 22 19:45:57.892716 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.892689 2569 scope.go:117] "RemoveContainer" containerID="fd7dbbbcf9d843aac188ee320d50f402c13832b2cd25d09392b442e7764b7986" Apr 22 19:45:57.901756 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.901734 2569 scope.go:117] "RemoveContainer" containerID="fc36a1f3ad81aa4d021871f0abd01100f1b4dcea85af5fe6aaf93477db8565f7" Apr 22 19:45:57.905508 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.905476 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc"] Apr 22 19:45:57.911624 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.911454 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6777dff47d-pjzdc"] Apr 22 19:45:57.911706 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.911632 2569 scope.go:117] "RemoveContainer" containerID="c67d742413ffbdd000b5173af11e7069591ecebe4100f67ce09693146b2aeb78" Apr 22 19:45:57.911995 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:45:57.911976 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c67d742413ffbdd000b5173af11e7069591ecebe4100f67ce09693146b2aeb78\": container with ID starting with c67d742413ffbdd000b5173af11e7069591ecebe4100f67ce09693146b2aeb78 not found: ID does not exist" containerID="c67d742413ffbdd000b5173af11e7069591ecebe4100f67ce09693146b2aeb78" Apr 22 19:45:57.912061 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.912008 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c67d742413ffbdd000b5173af11e7069591ecebe4100f67ce09693146b2aeb78"} err="failed to get container status \"c67d742413ffbdd000b5173af11e7069591ecebe4100f67ce09693146b2aeb78\": rpc error: code = NotFound desc = could not find container \"c67d742413ffbdd000b5173af11e7069591ecebe4100f67ce09693146b2aeb78\": container with ID starting with c67d742413ffbdd000b5173af11e7069591ecebe4100f67ce09693146b2aeb78 not found: ID does not exist" Apr 22 19:45:57.912061 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.912035 2569 scope.go:117] "RemoveContainer" containerID="fd7dbbbcf9d843aac188ee320d50f402c13832b2cd25d09392b442e7764b7986" Apr 22 19:45:57.912433 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:45:57.912404 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd7dbbbcf9d843aac188ee320d50f402c13832b2cd25d09392b442e7764b7986\": container with ID starting with fd7dbbbcf9d843aac188ee320d50f402c13832b2cd25d09392b442e7764b7986 not found: ID does not exist" containerID="fd7dbbbcf9d843aac188ee320d50f402c13832b2cd25d09392b442e7764b7986" Apr 22 19:45:57.912506 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.912431 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd7dbbbcf9d843aac188ee320d50f402c13832b2cd25d09392b442e7764b7986"} err="failed to get container status \"fd7dbbbcf9d843aac188ee320d50f402c13832b2cd25d09392b442e7764b7986\": rpc error: code = NotFound desc = could not find container 
\"fd7dbbbcf9d843aac188ee320d50f402c13832b2cd25d09392b442e7764b7986\": container with ID starting with fd7dbbbcf9d843aac188ee320d50f402c13832b2cd25d09392b442e7764b7986 not found: ID does not exist" Apr 22 19:45:57.912506 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.912448 2569 scope.go:117] "RemoveContainer" containerID="fc36a1f3ad81aa4d021871f0abd01100f1b4dcea85af5fe6aaf93477db8565f7" Apr 22 19:45:57.912711 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:45:57.912691 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc36a1f3ad81aa4d021871f0abd01100f1b4dcea85af5fe6aaf93477db8565f7\": container with ID starting with fc36a1f3ad81aa4d021871f0abd01100f1b4dcea85af5fe6aaf93477db8565f7 not found: ID does not exist" containerID="fc36a1f3ad81aa4d021871f0abd01100f1b4dcea85af5fe6aaf93477db8565f7" Apr 22 19:45:57.912761 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.912719 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc36a1f3ad81aa4d021871f0abd01100f1b4dcea85af5fe6aaf93477db8565f7"} err="failed to get container status \"fc36a1f3ad81aa4d021871f0abd01100f1b4dcea85af5fe6aaf93477db8565f7\": rpc error: code = NotFound desc = could not find container \"fc36a1f3ad81aa4d021871f0abd01100f1b4dcea85af5fe6aaf93477db8565f7\": container with ID starting with fc36a1f3ad81aa4d021871f0abd01100f1b4dcea85af5fe6aaf93477db8565f7 not found: ID does not exist" Apr 22 19:45:57.952209 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:45:57.952171 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abb41ec7-8919-4a86-a9f4-90b7f4f6b805" path="/var/lib/kubelet/pods/abb41ec7-8919-4a86-a9f4-90b7f4f6b805/volumes" Apr 22 19:46:08.152213 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.152131 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm"] Apr 22 19:46:08.152593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.152473 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="main" Apr 22 19:46:08.152593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.152483 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="main" Apr 22 19:46:08.152593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.152496 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="storage-initializer" Apr 22 19:46:08.152593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.152502 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="storage-initializer" Apr 22 19:46:08.152593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.152512 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abb41ec7-8919-4a86-a9f4-90b7f4f6b805" containerName="storage-initializer" Apr 22 19:46:08.152593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.152518 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb41ec7-8919-4a86-a9f4-90b7f4f6b805" containerName="storage-initializer" Apr 22 19:46:08.152593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.152526 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abb41ec7-8919-4a86-a9f4-90b7f4f6b805" 
containerName="tokenizer" Apr 22 19:46:08.152593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.152531 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb41ec7-8919-4a86-a9f4-90b7f4f6b805" containerName="tokenizer" Apr 22 19:46:08.152593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.152541 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abb41ec7-8919-4a86-a9f4-90b7f4f6b805" containerName="main" Apr 22 19:46:08.152593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.152546 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb41ec7-8919-4a86-a9f4-90b7f4f6b805" containerName="main" Apr 22 19:46:08.152593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.152595 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="abb41ec7-8919-4a86-a9f4-90b7f4f6b805" containerName="main" Apr 22 19:46:08.152934 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.152606 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="abb41ec7-8919-4a86-a9f4-90b7f4f6b805" containerName="tokenizer" Apr 22 19:46:08.152934 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.152614 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f71ed3c-cc66-46d4-b42f-001993a466c5" containerName="main" Apr 22 19:46:08.157650 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.157618 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.160641 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.160610 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-9bl2j\"" Apr 22 19:46:08.160807 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.160677 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 22 19:46:08.165587 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.165552 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm"] Apr 22 19:46:08.266949 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.266906 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.267203 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.266962 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.267203 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.267045 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/571c0e08-d9fe-42ca-9c33-6e873d307ce4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.267203 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.267084 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.267203 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.267143 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.267203 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.267179 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfxzf\" (UniqueName: \"kubernetes.io/projected/571c0e08-d9fe-42ca-9c33-6e873d307ce4-kube-api-access-cfxzf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.368291 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.368250 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.368501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.368307 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.368501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.368334 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/571c0e08-d9fe-42ca-9c33-6e873d307ce4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.368501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.368364 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.368501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.368383 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.368501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.368411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfxzf\" (UniqueName: \"kubernetes.io/projected/571c0e08-d9fe-42ca-9c33-6e873d307ce4-kube-api-access-cfxzf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.368777 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.368715 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.368777 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.368725 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.368864 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.368798 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.370886 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.370862 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.371171 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.371150 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/571c0e08-d9fe-42ca-9c33-6e873d307ce4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 
19:46:08.376473 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.376444 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfxzf\" (UniqueName: \"kubernetes.io/projected/571c0e08-d9fe-42ca-9c33-6e873d307ce4-kube-api-access-cfxzf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.473473 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.473431 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:08.613665 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.613624 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm"] Apr 22 19:46:08.616918 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:46:08.616873 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod571c0e08_d9fe_42ca_9c33_6e873d307ce4.slice/crio-4b633d5cc8d63575b00e94f1124371b7faf2d23088bbbdf3a318002ffb2f3dd0 WatchSource:0}: Error finding container 4b633d5cc8d63575b00e94f1124371b7faf2d23088bbbdf3a318002ffb2f3dd0: Status 404 returned error can't find the container with id 4b633d5cc8d63575b00e94f1124371b7faf2d23088bbbdf3a318002ffb2f3dd0 Apr 22 19:46:08.927371 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:08.927275 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" event={"ID":"571c0e08-d9fe-42ca-9c33-6e873d307ce4","Type":"ContainerStarted","Data":"4b633d5cc8d63575b00e94f1124371b7faf2d23088bbbdf3a318002ffb2f3dd0"} Apr 22 19:46:09.935940 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:09.935846 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" event={"ID":"571c0e08-d9fe-42ca-9c33-6e873d307ce4","Type":"ContainerStarted","Data":"ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939"} Apr 22 19:46:09.936346 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:09.935996 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:10.941662 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:10.941605 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" event={"ID":"571c0e08-d9fe-42ca-9c33-6e873d307ce4","Type":"ContainerStarted","Data":"d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a"} Apr 22 19:46:14.958051 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:14.958016 2569 generic.go:358] "Generic (PLEG): container finished" podID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerID="d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a" exitCode=0 Apr 22 19:46:14.958569 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:14.958118 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" event={"ID":"571c0e08-d9fe-42ca-9c33-6e873d307ce4","Type":"ContainerDied","Data":"d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a"} Apr 
22 19:46:15.963756 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:15.963720 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" event={"ID":"571c0e08-d9fe-42ca-9c33-6e873d307ce4","Type":"ContainerStarted","Data":"b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8"} Apr 22 19:46:15.987954 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:15.987878 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" podStartSLOduration=7.143391462 podStartE2EDuration="7.987858154s" podCreationTimestamp="2026-04-22 19:46:08 +0000 UTC" firstStartedPulling="2026-04-22 19:46:08.62198283 +0000 UTC m=+1341.269997616" lastFinishedPulling="2026-04-22 19:46:09.466449524 +0000 UTC m=+1342.114464308" observedRunningTime="2026-04-22 19:46:15.985983926 +0000 UTC m=+1348.633998721" watchObservedRunningTime="2026-04-22 19:46:15.987858154 +0000 UTC m=+1348.635872958" Apr 22 19:46:18.474112 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:18.474055 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:18.474112 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:18.474114 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:18.475609 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:18.475569 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8001/health\": dial tcp 10.133.0.51:8001: connect: connection refused" Apr 22 19:46:26.304318 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.304291 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-8db94848-d6qh7_a009586b-bd98-485d-9aea-b6e5a86349fd/main/0.log" Apr 22 19:46:26.304696 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.304675 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:46:26.437710 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.437676 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-dshm\") pod \"a009586b-bd98-485d-9aea-b6e5a86349fd\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " Apr 22 19:46:26.437903 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.437733 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-home\") pod \"a009586b-bd98-485d-9aea-b6e5a86349fd\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " Apr 22 19:46:26.437903 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.437781 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a009586b-bd98-485d-9aea-b6e5a86349fd-tls-certs\") pod \"a009586b-bd98-485d-9aea-b6e5a86349fd\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " Apr 22 19:46:26.437903 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.437819 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-kserve-provision-location\") pod \"a009586b-bd98-485d-9aea-b6e5a86349fd\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " Apr 22 19:46:26.437903 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.437854 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-model-cache\") pod \"a009586b-bd98-485d-9aea-b6e5a86349fd\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " Apr 22 19:46:26.438149 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.437953 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm5t2\" (UniqueName: \"kubernetes.io/projected/a009586b-bd98-485d-9aea-b6e5a86349fd-kube-api-access-mm5t2\") pod \"a009586b-bd98-485d-9aea-b6e5a86349fd\" (UID: \"a009586b-bd98-485d-9aea-b6e5a86349fd\") " Apr 22 19:46:26.438149 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.438091 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-home" (OuterVolumeSpecName: "home") pod "a009586b-bd98-485d-9aea-b6e5a86349fd" (UID: "a009586b-bd98-485d-9aea-b6e5a86349fd"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:46:26.438251 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.438184 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-model-cache" (OuterVolumeSpecName: "model-cache") pod "a009586b-bd98-485d-9aea-b6e5a86349fd" (UID: "a009586b-bd98-485d-9aea-b6e5a86349fd"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:46:26.438304 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.438256 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-home\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:46:26.440202 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.440167 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a009586b-bd98-485d-9aea-b6e5a86349fd-kube-api-access-mm5t2" (OuterVolumeSpecName: "kube-api-access-mm5t2") pod "a009586b-bd98-485d-9aea-b6e5a86349fd" (UID: "a009586b-bd98-485d-9aea-b6e5a86349fd"). InnerVolumeSpecName "kube-api-access-mm5t2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:46:26.440333 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.440310 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a009586b-bd98-485d-9aea-b6e5a86349fd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a009586b-bd98-485d-9aea-b6e5a86349fd" (UID: "a009586b-bd98-485d-9aea-b6e5a86349fd"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:46:26.440378 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.440324 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-dshm" (OuterVolumeSpecName: "dshm") pod "a009586b-bd98-485d-9aea-b6e5a86349fd" (UID: "a009586b-bd98-485d-9aea-b6e5a86349fd"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:46:26.521446 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.521399 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a009586b-bd98-485d-9aea-b6e5a86349fd" (UID: "a009586b-bd98-485d-9aea-b6e5a86349fd"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:46:26.539532 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.539447 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a009586b-bd98-485d-9aea-b6e5a86349fd-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:46:26.539532 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.539482 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:46:26.539532 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.539495 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-model-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:46:26.539532 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.539508 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mm5t2\" (UniqueName: \"kubernetes.io/projected/a009586b-bd98-485d-9aea-b6e5a86349fd-kube-api-access-mm5t2\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:46:26.539532 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:26.539520 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a009586b-bd98-485d-9aea-b6e5a86349fd-dshm\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:46:27.009195 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:27.009165 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-8db94848-d6qh7_a009586b-bd98-485d-9aea-b6e5a86349fd/main/0.log" Apr 22 19:46:27.009491 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:27.009469 2569 generic.go:358] "Generic (PLEG): container finished" podID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerID="a9c038050ffca19b34717cc279dbdcb3fbaba35bb9563332a9c0e0de99168029" exitCode=137 Apr 22 19:46:27.009568 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:27.009543 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" Apr 22 19:46:27.009628 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:27.009557 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" event={"ID":"a009586b-bd98-485d-9aea-b6e5a86349fd","Type":"ContainerDied","Data":"a9c038050ffca19b34717cc279dbdcb3fbaba35bb9563332a9c0e0de99168029"} Apr 22 19:46:27.009628 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:27.009598 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7" event={"ID":"a009586b-bd98-485d-9aea-b6e5a86349fd","Type":"ContainerDied","Data":"6e499aaeadd00178568b4cf24bdd4c08f734b61178143371c4a6b814fe5e5ab4"} Apr 22 19:46:27.009628 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:27.009615 2569 scope.go:117] "RemoveContainer" containerID="a9c038050ffca19b34717cc279dbdcb3fbaba35bb9563332a9c0e0de99168029" Apr 22 19:46:27.032815 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:27.032776 2569 scope.go:117] "RemoveContainer" containerID="d015e72d586bcbc69954383684b0c745987313ed11e2a23381f0833209557ae1" Apr 22 19:46:27.034879 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:27.034855 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7"] Apr 22 19:46:27.040942 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:27.040909 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-8db94848-d6qh7"] Apr 22 19:46:27.047147 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:27.047122 2569 scope.go:117] "RemoveContainer" containerID="a9c038050ffca19b34717cc279dbdcb3fbaba35bb9563332a9c0e0de99168029" Apr 22 19:46:27.047513 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:46:27.047485 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9c038050ffca19b34717cc279dbdcb3fbaba35bb9563332a9c0e0de99168029\": container with ID starting with a9c038050ffca19b34717cc279dbdcb3fbaba35bb9563332a9c0e0de99168029 not found: ID does not exist" containerID="a9c038050ffca19b34717cc279dbdcb3fbaba35bb9563332a9c0e0de99168029" Apr 22 19:46:27.047596 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:27.047524 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9c038050ffca19b34717cc279dbdcb3fbaba35bb9563332a9c0e0de99168029"} err="failed to get container status \"a9c038050ffca19b34717cc279dbdcb3fbaba35bb9563332a9c0e0de99168029\": rpc error: code = NotFound desc = could not find container \"a9c038050ffca19b34717cc279dbdcb3fbaba35bb9563332a9c0e0de99168029\": container with ID starting with a9c038050ffca19b34717cc279dbdcb3fbaba35bb9563332a9c0e0de99168029 not found: ID does not exist" Apr 22 19:46:27.047596 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:27.047545 2569 scope.go:117] "RemoveContainer" containerID="d015e72d586bcbc69954383684b0c745987313ed11e2a23381f0833209557ae1" Apr 22 19:46:27.047856 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:46:27.047835 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d015e72d586bcbc69954383684b0c745987313ed11e2a23381f0833209557ae1\": container with ID starting with d015e72d586bcbc69954383684b0c745987313ed11e2a23381f0833209557ae1 not found: ID does not exist" 
containerID="d015e72d586bcbc69954383684b0c745987313ed11e2a23381f0833209557ae1" Apr 22 19:46:27.047909 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:27.047862 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d015e72d586bcbc69954383684b0c745987313ed11e2a23381f0833209557ae1"} err="failed to get container status \"d015e72d586bcbc69954383684b0c745987313ed11e2a23381f0833209557ae1\": rpc error: code = NotFound desc = could not find container \"d015e72d586bcbc69954383684b0c745987313ed11e2a23381f0833209557ae1\": container with ID starting with d015e72d586bcbc69954383684b0c745987313ed11e2a23381f0833209557ae1 not found: ID does not exist" Apr 22 19:46:27.952119 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:27.952069 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" path="/var/lib/kubelet/pods/a009586b-bd98-485d-9aea-b6e5a86349fd/volumes" Apr 22 19:46:28.474725 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:28.474680 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8001/health\": dial tcp 10.133.0.51:8001: connect: connection refused" Apr 22 19:46:28.487655 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:28.487622 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:46:38.474760 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:38.474709 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8001/health\": dial tcp 10.133.0.51:8001: connect: connection refused" Apr 22 19:46:48.474168 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:48.474120 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8001/health\": dial tcp 10.133.0.51:8001: connect: connection refused" Apr 22 19:46:58.474472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:46:58.474428 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8001/health\": dial tcp 10.133.0.51:8001: connect: connection refused" Apr 22 19:47:00.039204 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.039162 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:47:00.039584 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.039530 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="storage-initializer" Apr 22 19:47:00.039584 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.039545 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="storage-initializer" 
Apr 22 19:47:00.039584 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.039570 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="main" Apr 22 19:47:00.039584 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.039576 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="main" Apr 22 19:47:00.039744 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.039630 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a009586b-bd98-485d-9aea-b6e5a86349fd" containerName="main" Apr 22 19:47:00.044312 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.044290 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.047222 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.047195 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 22 19:47:00.048287 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.048271 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-kvsxc\"" Apr 22 19:47:00.055235 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.055204 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:47:00.128224 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.128187 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.128400 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.128255 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/54e1fe26-aa32-4431-b679-4cfc6b9dc645-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.128400 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.128304 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.128400 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.128329 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.128400 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:47:00.128352 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrp5d\" (UniqueName: \"kubernetes.io/projected/54e1fe26-aa32-4431-b679-4cfc6b9dc645-kube-api-access-lrp5d\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.128400 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.128378 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.148574 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.148538 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l"] Apr 22 19:47:00.152902 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.152872 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.155711 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.155682 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-hklvm\"" Apr 22 19:47:00.164409 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.164379 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l"] Apr 22 19:47:00.229420 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229376 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.229420 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229422 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0456f161-8adf-4f24-a254-2e3c418c21a4-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.229670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229463 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.229670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229481 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.229670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229502 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pmqk\" (UniqueName: \"kubernetes.io/projected/0456f161-8adf-4f24-a254-2e3c418c21a4-kube-api-access-2pmqk\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.229670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229526 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.229670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.229670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229603 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/54e1fe26-aa32-4431-b679-4cfc6b9dc645-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.229670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229634 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.229670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229671 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.230071 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229691 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-model-cache\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.230071 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229706 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrp5d\" (UniqueName: \"kubernetes.io/projected/54e1fe26-aa32-4431-b679-4cfc6b9dc645-kube-api-access-lrp5d\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.230071 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229874 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.230071 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229960 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.230071 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.229988 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.231911 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.231890 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.232177 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.232159 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/54e1fe26-aa32-4431-b679-4cfc6b9dc645-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.245739 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.245702 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrp5d\" (UniqueName: \"kubernetes.io/projected/54e1fe26-aa32-4431-b679-4cfc6b9dc645-kube-api-access-lrp5d\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.330344 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.330248 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.330344 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.330329 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0456f161-8adf-4f24-a254-2e3c418c21a4-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.330579 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.330361 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.330579 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.330384 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pmqk\" (UniqueName: \"kubernetes.io/projected/0456f161-8adf-4f24-a254-2e3c418c21a4-kube-api-access-2pmqk\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.330579 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.330407 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.330579 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.330422 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.330806 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.330736 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.330806 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.330771 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-kserve-provision-location\") pod 
\"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.330921 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.330815 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.330921 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.330859 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.333000 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.332975 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0456f161-8adf-4f24-a254-2e3c418c21a4-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.339778 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.339743 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pmqk\" (UniqueName: \"kubernetes.io/projected/0456f161-8adf-4f24-a254-2e3c418c21a4-kube-api-access-2pmqk\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.356586 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.356540 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:00.466416 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.466361 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:00.515597 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.515565 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:47:00.517792 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:47:00.517757 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54e1fe26_aa32_4431_b679_4cfc6b9dc645.slice/crio-b772fc206615657c855e0a456251a7989cc155429ea4c105adb4c513039afa76 WatchSource:0}: Error finding container b772fc206615657c855e0a456251a7989cc155429ea4c105adb4c513039afa76: Status 404 returned error can't find the container with id b772fc206615657c855e0a456251a7989cc155429ea4c105adb4c513039afa76 Apr 22 19:47:00.655696 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:00.655653 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l"] Apr 22 19:47:00.659081 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:47:00.659049 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0456f161_8adf_4f24_a254_2e3c418c21a4.slice/crio-01e042fe5bf84931596111c11c102a6bdc6b9c5d0ef145d4623831d451a28810 WatchSource:0}: Error finding container 01e042fe5bf84931596111c11c102a6bdc6b9c5d0ef145d4623831d451a28810: Status 404 returned error can't find the container with id 01e042fe5bf84931596111c11c102a6bdc6b9c5d0ef145d4623831d451a28810 Apr 22 19:47:01.155028 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:01.154987 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" event={"ID":"0456f161-8adf-4f24-a254-2e3c418c21a4","Type":"ContainerStarted","Data":"dbb66a5b66c135c7cb04aec438385a15025eb74c772d6af8afe9520862d58d00"} Apr 22 19:47:01.155633 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:01.155605 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" event={"ID":"0456f161-8adf-4f24-a254-2e3c418c21a4","Type":"ContainerStarted","Data":"01e042fe5bf84931596111c11c102a6bdc6b9c5d0ef145d4623831d451a28810"} Apr 22 19:47:01.156737 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:01.156712 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"54e1fe26-aa32-4431-b679-4cfc6b9dc645","Type":"ContainerStarted","Data":"c2939b0af5653269339cfcd6e0a5cae22b3e8e72181f32a602e2cc83cb45fc30"} Apr 22 19:47:01.156848 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:01.156746 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"54e1fe26-aa32-4431-b679-4cfc6b9dc645","Type":"ContainerStarted","Data":"b772fc206615657c855e0a456251a7989cc155429ea4c105adb4c513039afa76"} Apr 22 19:47:02.164117 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:02.164064 2569 generic.go:358] "Generic (PLEG): container finished" podID="0456f161-8adf-4f24-a254-2e3c418c21a4" containerID="dbb66a5b66c135c7cb04aec438385a15025eb74c772d6af8afe9520862d58d00" exitCode=0 Apr 22 19:47:02.164583 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:02.164154 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" event={"ID":"0456f161-8adf-4f24-a254-2e3c418c21a4","Type":"ContainerDied","Data":"dbb66a5b66c135c7cb04aec438385a15025eb74c772d6af8afe9520862d58d00"} Apr 22 19:47:03.170663 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:03.170624 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" event={"ID":"0456f161-8adf-4f24-a254-2e3c418c21a4","Type":"ContainerStarted","Data":"157a3d86bf74a357186438b3b68e5789a9237f7d6439fe004ac41deebddeb899"} Apr 22 19:47:03.170663 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:03.170661 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" event={"ID":"0456f161-8adf-4f24-a254-2e3c418c21a4","Type":"ContainerStarted","Data":"144289bdb588a97bc50d7ef08453cdded52714a3eaa352e45a2f63fcfc4e8f06"} Apr 22 19:47:03.171279 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:03.170790 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:03.197034 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:03.196957 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" podStartSLOduration=3.196939527 podStartE2EDuration="3.196939527s" podCreationTimestamp="2026-04-22 19:47:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:47:03.193461097 +0000 UTC m=+1395.841475927" watchObservedRunningTime="2026-04-22 19:47:03.196939527 +0000 UTC m=+1395.844954331" Apr 22 19:47:06.185338 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:06.185305 2569 generic.go:358] "Generic (PLEG): container finished" podID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerID="c2939b0af5653269339cfcd6e0a5cae22b3e8e72181f32a602e2cc83cb45fc30" exitCode=0 Apr 22 19:47:06.185740 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:06.185354 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"54e1fe26-aa32-4431-b679-4cfc6b9dc645","Type":"ContainerDied","Data":"c2939b0af5653269339cfcd6e0a5cae22b3e8e72181f32a602e2cc83cb45fc30"} Apr 22 19:47:06.186712 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:06.186692 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:47:07.192137 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:07.192082 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"54e1fe26-aa32-4431-b679-4cfc6b9dc645","Type":"ContainerStarted","Data":"06e17e42c65207da7ec21f887176eb6a59206745bdf35806e27f73dde56bc697"} Apr 22 19:47:07.215641 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:07.215562 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=7.215539766 podStartE2EDuration="7.215539766s" podCreationTimestamp="2026-04-22 19:47:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 19:47:07.211588636 +0000 UTC m=+1399.859603469" watchObservedRunningTime="2026-04-22 19:47:07.215539766 +0000 UTC m=+1399.863554571" Apr 22 19:47:08.474821 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:08.474771 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8001/health\": dial tcp 10.133.0.51:8001: connect: connection refused" Apr 22 19:47:10.357494 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:10.357442 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:10.359487 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:10.359421 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 22 19:47:10.467433 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:10.467387 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:10.467433 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:10.467442 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:10.470357 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:10.470329 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:11.211410 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:11.211374 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:18.474653 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:18.474599 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8001/health\": dial tcp 10.133.0.51:8001: connect: connection refused" Apr 22 19:47:20.357481 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:20.357439 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 22 19:47:28.474309 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:28.474252 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8001/health\": dial tcp 10.133.0.51:8001: connect: connection refused" Apr 22 19:47:30.357244 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:30.357189 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:47:30.357756 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:30.357455 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 22 19:47:32.216227 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:32.216192 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:47:38.474566 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:38.474460 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8001/health\": dial tcp 10.133.0.51:8001: connect: connection refused" Apr 22 19:47:40.357359 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:40.357308 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 22 19:47:48.474979 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:48.474924 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8001/health\": dial tcp 10.133.0.51:8001: connect: connection refused" Apr 22 19:47:50.357209 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:50.357168 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 22 19:47:58.489792 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:58.489760 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:47:58.501683 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:47:58.501656 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:48:00.357828 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:00.357786 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 22 19:48:10.357425 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:10.357363 2569 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 22 19:48:16.090829 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:16.090789 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm"] Apr 22 19:48:16.091467 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:16.091239 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="main" containerID="cri-o://b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8" gracePeriod=30 Apr 22 19:48:20.357150 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:20.357064 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 22 19:48:23.405377 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.405335 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp"] Apr 22 19:48:23.410826 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.410801 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.414185 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.414159 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 22 19:48:23.421949 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.421596 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp"] Apr 22 19:48:23.524322 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.524268 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.524535 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.524419 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.524535 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.524448 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvwl8\" (UniqueName: \"kubernetes.io/projected/56291835-4550-4b8a-921a-fd31c4d1d1d5-kube-api-access-fvwl8\") pod 
\"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.524535 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.524486 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.524535 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.524527 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56291835-4550-4b8a-921a-fd31c4d1d1d5-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.524712 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.524579 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.625552 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.625518 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.625552 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.625555 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvwl8\" (UniqueName: \"kubernetes.io/projected/56291835-4550-4b8a-921a-fd31c4d1d1d5-kube-api-access-fvwl8\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.625778 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.625590 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.625778 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.625632 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56291835-4550-4b8a-921a-fd31c4d1d1d5-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.625778 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.625689 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.625946 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.625925 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.626004 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.625966 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.626112 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.626069 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.626253 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.626218 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.628354 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.628322 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.628354 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.628347 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56291835-4550-4b8a-921a-fd31c4d1d1d5-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.635634 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.635605 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvwl8\" 
(UniqueName: \"kubernetes.io/projected/56291835-4550-4b8a-921a-fd31c4d1d1d5-kube-api-access-fvwl8\") pod \"custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:23.723608 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:23.723568 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:24.081866 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:24.081775 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp"] Apr 22 19:48:24.534084 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:24.534047 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" event={"ID":"56291835-4550-4b8a-921a-fd31c4d1d1d5","Type":"ContainerStarted","Data":"fe1dc8a3c3d708502ebf63de479d1342b3a27ab5849471ae997e9d882c534551"} Apr 22 19:48:24.534084 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:24.534086 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" event={"ID":"56291835-4550-4b8a-921a-fd31c4d1d1d5","Type":"ContainerStarted","Data":"c83b7c9da6cd2aacc6f8c027b8056c67af574470941c9eda9f5995a455a09697"} Apr 22 19:48:29.557469 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:29.557432 2569 generic.go:358] "Generic (PLEG): container finished" podID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerID="fe1dc8a3c3d708502ebf63de479d1342b3a27ab5849471ae997e9d882c534551" exitCode=0 Apr 22 19:48:29.557867 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:29.557507 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" event={"ID":"56291835-4550-4b8a-921a-fd31c4d1d1d5","Type":"ContainerDied","Data":"fe1dc8a3c3d708502ebf63de479d1342b3a27ab5849471ae997e9d882c534551"} Apr 22 19:48:30.357286 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:30.357227 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 22 19:48:30.563936 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:30.563889 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" event={"ID":"56291835-4550-4b8a-921a-fd31c4d1d1d5","Type":"ContainerStarted","Data":"43ca8d2025ab846d199b4817c980743a44da71c34ae187f0068a76ff4a89a72e"} Apr 22 19:48:30.599068 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:30.599006 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" podStartSLOduration=7.598986935 podStartE2EDuration="7.598986935s" podCreationTimestamp="2026-04-22 19:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:48:30.597917947 +0000 UTC m=+1483.245932750" watchObservedRunningTime="2026-04-22 19:48:30.598986935 +0000 
UTC m=+1483.247001740" Apr 22 19:48:33.724473 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:33.724423 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:33.724473 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:33.724471 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:48:33.726066 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:33.726030 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused" Apr 22 19:48:40.357459 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:40.357408 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 22 19:48:43.725211 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:43.725153 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused" Apr 22 19:48:46.091431 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.091347 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="llm-d-routing-sidecar" containerID="cri-o://ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939" gracePeriod=2 Apr 22 19:48:46.381527 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.381182 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm_571c0e08-d9fe-42ca-9c33-6e873d307ce4/main/0.log" Apr 22 19:48:46.382457 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.382403 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:48:46.431288 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.431258 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfxzf\" (UniqueName: \"kubernetes.io/projected/571c0e08-d9fe-42ca-9c33-6e873d307ce4-kube-api-access-cfxzf\") pod \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " Apr 22 19:48:46.431472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.431339 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/571c0e08-d9fe-42ca-9c33-6e873d307ce4-tls-certs\") pod \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " Apr 22 19:48:46.431472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.431367 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-kserve-provision-location\") pod \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " Apr 22 19:48:46.431472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.431390 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-model-cache\") pod \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " Apr 22 19:48:46.431472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.431411 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-dshm\") pod \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " Apr 22 19:48:46.431472 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.431437 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-home\") pod \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\" (UID: \"571c0e08-d9fe-42ca-9c33-6e873d307ce4\") " Apr 22 19:48:46.432119 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.432056 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-home" (OuterVolumeSpecName: "home") pod "571c0e08-d9fe-42ca-9c33-6e873d307ce4" (UID: "571c0e08-d9fe-42ca-9c33-6e873d307ce4"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:46.432255 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.432148 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-model-cache" (OuterVolumeSpecName: "model-cache") pod "571c0e08-d9fe-42ca-9c33-6e873d307ce4" (UID: "571c0e08-d9fe-42ca-9c33-6e873d307ce4"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:46.433744 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.433715 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/571c0e08-d9fe-42ca-9c33-6e873d307ce4-kube-api-access-cfxzf" (OuterVolumeSpecName: "kube-api-access-cfxzf") pod "571c0e08-d9fe-42ca-9c33-6e873d307ce4" (UID: "571c0e08-d9fe-42ca-9c33-6e873d307ce4"). InnerVolumeSpecName "kube-api-access-cfxzf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:48:46.434319 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.434290 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-dshm" (OuterVolumeSpecName: "dshm") pod "571c0e08-d9fe-42ca-9c33-6e873d307ce4" (UID: "571c0e08-d9fe-42ca-9c33-6e873d307ce4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:46.434406 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.434319 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/571c0e08-d9fe-42ca-9c33-6e873d307ce4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "571c0e08-d9fe-42ca-9c33-6e873d307ce4" (UID: "571c0e08-d9fe-42ca-9c33-6e873d307ce4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:48:46.516247 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.516184 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "571c0e08-d9fe-42ca-9c33-6e873d307ce4" (UID: "571c0e08-d9fe-42ca-9c33-6e873d307ce4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:46.532198 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.532169 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/571c0e08-d9fe-42ca-9c33-6e873d307ce4-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:48:46.532198 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.532195 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:48:46.532337 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.532205 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-model-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:48:46.532337 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.532216 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-dshm\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:48:46.532337 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.532223 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/571c0e08-d9fe-42ca-9c33-6e873d307ce4-home\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:48:46.532337 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.532232 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cfxzf\" (UniqueName: \"kubernetes.io/projected/571c0e08-d9fe-42ca-9c33-6e873d307ce4-kube-api-access-cfxzf\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:48:46.630878 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.630789 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm_571c0e08-d9fe-42ca-9c33-6e873d307ce4/main/0.log" Apr 22 19:48:46.631482 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.631443 2569 generic.go:358] "Generic (PLEG): container finished" podID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerID="b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8" exitCode=137 Apr 22 19:48:46.631482 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.631472 2569 generic.go:358] "Generic (PLEG): container finished" podID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerID="ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939" exitCode=0 Apr 22 19:48:46.631691 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.631518 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" event={"ID":"571c0e08-d9fe-42ca-9c33-6e873d307ce4","Type":"ContainerDied","Data":"b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8"} Apr 22 19:48:46.631691 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.631547 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" event={"ID":"571c0e08-d9fe-42ca-9c33-6e873d307ce4","Type":"ContainerDied","Data":"ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939"} Apr 22 19:48:46.631691 ip-10-0-140-242 kubenswrapper[2569]: I0422 
19:48:46.631546 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" Apr 22 19:48:46.631691 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.631565 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm" event={"ID":"571c0e08-d9fe-42ca-9c33-6e873d307ce4","Type":"ContainerDied","Data":"4b633d5cc8d63575b00e94f1124371b7faf2d23088bbbdf3a318002ffb2f3dd0"} Apr 22 19:48:46.631691 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.631580 2569 scope.go:117] "RemoveContainer" containerID="b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8" Apr 22 19:48:46.652207 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.652178 2569 scope.go:117] "RemoveContainer" containerID="d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a" Apr 22 19:48:46.660544 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.660510 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm"] Apr 22 19:48:46.664408 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.664381 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-59965f84c7xm8cm"] Apr 22 19:48:46.721483 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.721455 2569 scope.go:117] "RemoveContainer" containerID="ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939" Apr 22 19:48:46.732304 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.732283 2569 scope.go:117] "RemoveContainer" containerID="b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8" Apr 22 19:48:46.732621 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:48:46.732602 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8\": container with ID starting with b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8 not found: ID does not exist" containerID="b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8" Apr 22 19:48:46.732696 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.732630 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8"} err="failed to get container status \"b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8\": rpc error: code = NotFound desc = could not find container \"b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8\": container with ID starting with b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8 not found: ID does not exist" Apr 22 19:48:46.732696 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.732652 2569 scope.go:117] "RemoveContainer" containerID="d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a" Apr 22 19:48:46.732919 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:48:46.732896 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a\": container with ID starting with d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a not found: ID does not exist" 
containerID="d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a" Apr 22 19:48:46.732992 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.732930 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a"} err="failed to get container status \"d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a\": rpc error: code = NotFound desc = could not find container \"d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a\": container with ID starting with d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a not found: ID does not exist" Apr 22 19:48:46.732992 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.732956 2569 scope.go:117] "RemoveContainer" containerID="ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939" Apr 22 19:48:46.733264 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:48:46.733246 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939\": container with ID starting with ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939 not found: ID does not exist" containerID="ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939" Apr 22 19:48:46.733319 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.733269 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939"} err="failed to get container status \"ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939\": rpc error: code = NotFound desc = could not find container \"ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939\": container with ID starting with ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939 not found: ID does not exist" Apr 22 19:48:46.733319 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.733284 2569 scope.go:117] "RemoveContainer" containerID="b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8" Apr 22 19:48:46.733590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.733564 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8"} err="failed to get container status \"b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8\": rpc error: code = NotFound desc = could not find container \"b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8\": container with ID starting with b4a20bafdfd7d89400a347f4653bcbd6aae8ce2d0c67f47a10dd85937ba345a8 not found: ID does not exist" Apr 22 19:48:46.733590 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.733589 2569 scope.go:117] "RemoveContainer" containerID="d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a" Apr 22 19:48:46.733815 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.733794 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a"} err="failed to get container status \"d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a\": rpc error: code = NotFound desc = could not find container \"d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a\": container with ID starting with 
d85f4fcb229371d358996f02a97ae811c18ed6fc6737408bbb0dce0a2c79676a not found: ID does not exist" Apr 22 19:48:46.733815 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.733814 2569 scope.go:117] "RemoveContainer" containerID="ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939" Apr 22 19:48:46.734030 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:46.734011 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939"} err="failed to get container status \"ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939\": rpc error: code = NotFound desc = could not find container \"ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939\": container with ID starting with ae714202b22a7210cb46ad10d62d294828f509f97cc4f8391d04d74ea1c15939 not found: ID does not exist" Apr 22 19:48:47.953412 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:47.953373 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" path="/var/lib/kubelet/pods/571c0e08-d9fe-42ca-9c33-6e873d307ce4/volumes" Apr 22 19:48:47.994215 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:47.994185 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:48:47.996869 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:47.996844 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:48:50.366658 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:50.366630 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:48:50.375254 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:50.375221 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:48:53.724347 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:53.724300 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused" Apr 22 19:48:58.742728 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:58.742695 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l"] Apr 22 19:48:58.743228 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:58.742987 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" podUID="0456f161-8adf-4f24-a254-2e3c418c21a4" containerName="main" containerID="cri-o://144289bdb588a97bc50d7ef08453cdded52714a3eaa352e45a2f63fcfc4e8f06" gracePeriod=30 Apr 22 19:48:58.743228 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:58.743059 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" podUID="0456f161-8adf-4f24-a254-2e3c418c21a4" containerName="tokenizer" 
containerID="cri-o://157a3d86bf74a357186438b3b68e5789a9237f7d6439fe004ac41deebddeb899" gracePeriod=30 Apr 22 19:48:58.757907 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:58.757864 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:48:58.758320 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:58.758290 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="main" containerID="cri-o://06e17e42c65207da7ec21f887176eb6a59206745bdf35806e27f73dde56bc697" gracePeriod=30 Apr 22 19:48:59.640717 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.640687 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:48:59.695531 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.695491 2569 generic.go:358] "Generic (PLEG): container finished" podID="0456f161-8adf-4f24-a254-2e3c418c21a4" containerID="144289bdb588a97bc50d7ef08453cdded52714a3eaa352e45a2f63fcfc4e8f06" exitCode=0 Apr 22 19:48:59.695735 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.695562 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" event={"ID":"0456f161-8adf-4f24-a254-2e3c418c21a4","Type":"ContainerDied","Data":"144289bdb588a97bc50d7ef08453cdded52714a3eaa352e45a2f63fcfc4e8f06"} Apr 22 19:48:59.697316 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.697292 2569 generic.go:358] "Generic (PLEG): container finished" podID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerID="06e17e42c65207da7ec21f887176eb6a59206745bdf35806e27f73dde56bc697" exitCode=0 Apr 22 19:48:59.697486 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.697375 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"54e1fe26-aa32-4431-b679-4cfc6b9dc645","Type":"ContainerDied","Data":"06e17e42c65207da7ec21f887176eb6a59206745bdf35806e27f73dde56bc697"} Apr 22 19:48:59.697486 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.697422 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"54e1fe26-aa32-4431-b679-4cfc6b9dc645","Type":"ContainerDied","Data":"b772fc206615657c855e0a456251a7989cc155429ea4c105adb4c513039afa76"} Apr 22 19:48:59.697486 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.697443 2569 scope.go:117] "RemoveContainer" containerID="06e17e42c65207da7ec21f887176eb6a59206745bdf35806e27f73dde56bc697" Apr 22 19:48:59.697486 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.697387 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 19:48:59.724441 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.724415 2569 scope.go:117] "RemoveContainer" containerID="c2939b0af5653269339cfcd6e0a5cae22b3e8e72181f32a602e2cc83cb45fc30" Apr 22 19:48:59.763908 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.763861 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/54e1fe26-aa32-4431-b679-4cfc6b9dc645-tls-certs\") pod \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " Apr 22 19:48:59.764410 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.763929 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-kserve-provision-location\") pod \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " Apr 22 19:48:59.764410 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.763966 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-dshm\") pod \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " Apr 22 19:48:59.764410 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.764011 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-model-cache\") pod \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " Apr 22 19:48:59.764410 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.764050 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-home\") pod \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " Apr 22 19:48:59.764410 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.764122 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrp5d\" (UniqueName: \"kubernetes.io/projected/54e1fe26-aa32-4431-b679-4cfc6b9dc645-kube-api-access-lrp5d\") pod \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\" (UID: \"54e1fe26-aa32-4431-b679-4cfc6b9dc645\") " Apr 22 19:48:59.764410 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.764279 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-model-cache" (OuterVolumeSpecName: "model-cache") pod "54e1fe26-aa32-4431-b679-4cfc6b9dc645" (UID: "54e1fe26-aa32-4431-b679-4cfc6b9dc645"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:59.764744 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.764475 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-model-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:48:59.764804 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.764774 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-home" (OuterVolumeSpecName: "home") pod "54e1fe26-aa32-4431-b679-4cfc6b9dc645" (UID: "54e1fe26-aa32-4431-b679-4cfc6b9dc645"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:59.767080 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.767049 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54e1fe26-aa32-4431-b679-4cfc6b9dc645-kube-api-access-lrp5d" (OuterVolumeSpecName: "kube-api-access-lrp5d") pod "54e1fe26-aa32-4431-b679-4cfc6b9dc645" (UID: "54e1fe26-aa32-4431-b679-4cfc6b9dc645"). InnerVolumeSpecName "kube-api-access-lrp5d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:48:59.767401 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.767364 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-dshm" (OuterVolumeSpecName: "dshm") pod "54e1fe26-aa32-4431-b679-4cfc6b9dc645" (UID: "54e1fe26-aa32-4431-b679-4cfc6b9dc645"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:59.767516 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.767447 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e1fe26-aa32-4431-b679-4cfc6b9dc645-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "54e1fe26-aa32-4431-b679-4cfc6b9dc645" (UID: "54e1fe26-aa32-4431-b679-4cfc6b9dc645"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:48:59.825134 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.824912 2569 scope.go:117] "RemoveContainer" containerID="06e17e42c65207da7ec21f887176eb6a59206745bdf35806e27f73dde56bc697" Apr 22 19:48:59.825376 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:48:59.825342 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e17e42c65207da7ec21f887176eb6a59206745bdf35806e27f73dde56bc697\": container with ID starting with 06e17e42c65207da7ec21f887176eb6a59206745bdf35806e27f73dde56bc697 not found: ID does not exist" containerID="06e17e42c65207da7ec21f887176eb6a59206745bdf35806e27f73dde56bc697" Apr 22 19:48:59.825458 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.825387 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e17e42c65207da7ec21f887176eb6a59206745bdf35806e27f73dde56bc697"} err="failed to get container status \"06e17e42c65207da7ec21f887176eb6a59206745bdf35806e27f73dde56bc697\": rpc error: code = NotFound desc = could not find container \"06e17e42c65207da7ec21f887176eb6a59206745bdf35806e27f73dde56bc697\": container with ID starting with 06e17e42c65207da7ec21f887176eb6a59206745bdf35806e27f73dde56bc697 not found: ID does not exist" Apr 22 19:48:59.825508 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.825461 2569 scope.go:117] "RemoveContainer" containerID="c2939b0af5653269339cfcd6e0a5cae22b3e8e72181f32a602e2cc83cb45fc30" Apr 22 19:48:59.825839 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:48:59.825785 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2939b0af5653269339cfcd6e0a5cae22b3e8e72181f32a602e2cc83cb45fc30\": container with ID starting with c2939b0af5653269339cfcd6e0a5cae22b3e8e72181f32a602e2cc83cb45fc30 not found: ID does not exist" containerID="c2939b0af5653269339cfcd6e0a5cae22b3e8e72181f32a602e2cc83cb45fc30" Apr 22 19:48:59.825902 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.825848 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2939b0af5653269339cfcd6e0a5cae22b3e8e72181f32a602e2cc83cb45fc30"} err="failed to get container status \"c2939b0af5653269339cfcd6e0a5cae22b3e8e72181f32a602e2cc83cb45fc30\": rpc error: code = NotFound desc = could not find container \"c2939b0af5653269339cfcd6e0a5cae22b3e8e72181f32a602e2cc83cb45fc30\": container with ID starting with c2939b0af5653269339cfcd6e0a5cae22b3e8e72181f32a602e2cc83cb45fc30 not found: ID does not exist" Apr 22 19:48:59.839326 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.839277 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "54e1fe26-aa32-4431-b679-4cfc6b9dc645" (UID: "54e1fe26-aa32-4431-b679-4cfc6b9dc645"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:59.865191 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.865119 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-home\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:48:59.865191 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.865186 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lrp5d\" (UniqueName: \"kubernetes.io/projected/54e1fe26-aa32-4431-b679-4cfc6b9dc645-kube-api-access-lrp5d\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:48:59.865452 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.865203 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/54e1fe26-aa32-4431-b679-4cfc6b9dc645-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:48:59.865452 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.865220 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:48:59.865452 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:48:59.865234 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/54e1fe26-aa32-4431-b679-4cfc6b9dc645-dshm\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:00.028160 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.028125 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:49:00.031511 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.031481 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 19:49:00.144614 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.144591 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:49:00.271980 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.271892 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-kserve-provision-location\") pod \"0456f161-8adf-4f24-a254-2e3c418c21a4\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " Apr 22 19:49:00.271980 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.271931 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-uds\") pod \"0456f161-8adf-4f24-a254-2e3c418c21a4\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " Apr 22 19:49:00.272258 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.272015 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0456f161-8adf-4f24-a254-2e3c418c21a4-tls-certs\") pod \"0456f161-8adf-4f24-a254-2e3c418c21a4\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " Apr 22 19:49:00.272258 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.272047 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-cache\") pod \"0456f161-8adf-4f24-a254-2e3c418c21a4\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " Apr 22 19:49:00.272258 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.272077 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pmqk\" (UniqueName: \"kubernetes.io/projected/0456f161-8adf-4f24-a254-2e3c418c21a4-kube-api-access-2pmqk\") pod \"0456f161-8adf-4f24-a254-2e3c418c21a4\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " Apr 22 19:49:00.272258 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.272178 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-tmp\") pod \"0456f161-8adf-4f24-a254-2e3c418c21a4\" (UID: \"0456f161-8adf-4f24-a254-2e3c418c21a4\") " Apr 22 19:49:00.272477 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.272309 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "0456f161-8adf-4f24-a254-2e3c418c21a4" (UID: "0456f161-8adf-4f24-a254-2e3c418c21a4"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:00.272477 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.272320 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "0456f161-8adf-4f24-a254-2e3c418c21a4" (UID: "0456f161-8adf-4f24-a254-2e3c418c21a4"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:00.272582 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.272505 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:00.272582 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.272545 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-uds\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:00.272582 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.272549 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "0456f161-8adf-4f24-a254-2e3c418c21a4" (UID: "0456f161-8adf-4f24-a254-2e3c418c21a4"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:00.272737 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.272678 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0456f161-8adf-4f24-a254-2e3c418c21a4" (UID: "0456f161-8adf-4f24-a254-2e3c418c21a4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:00.274326 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.274307 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0456f161-8adf-4f24-a254-2e3c418c21a4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0456f161-8adf-4f24-a254-2e3c418c21a4" (UID: "0456f161-8adf-4f24-a254-2e3c418c21a4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:49:00.274435 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.274357 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0456f161-8adf-4f24-a254-2e3c418c21a4-kube-api-access-2pmqk" (OuterVolumeSpecName: "kube-api-access-2pmqk") pod "0456f161-8adf-4f24-a254-2e3c418c21a4" (UID: "0456f161-8adf-4f24-a254-2e3c418c21a4"). InnerVolumeSpecName "kube-api-access-2pmqk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:49:00.373405 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.373367 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0456f161-8adf-4f24-a254-2e3c418c21a4-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:00.373405 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.373405 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2pmqk\" (UniqueName: \"kubernetes.io/projected/0456f161-8adf-4f24-a254-2e3c418c21a4-kube-api-access-2pmqk\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:00.373604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.373422 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-tokenizer-tmp\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:00.373604 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.373438 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0456f161-8adf-4f24-a254-2e3c418c21a4-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:00.703988 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.703955 2569 generic.go:358] "Generic (PLEG): container finished" podID="0456f161-8adf-4f24-a254-2e3c418c21a4" containerID="157a3d86bf74a357186438b3b68e5789a9237f7d6439fe004ac41deebddeb899" exitCode=0 Apr 22 19:49:00.704203 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.704045 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" Apr 22 19:49:00.704203 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.704048 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" event={"ID":"0456f161-8adf-4f24-a254-2e3c418c21a4","Type":"ContainerDied","Data":"157a3d86bf74a357186438b3b68e5789a9237f7d6439fe004ac41deebddeb899"} Apr 22 19:49:00.704203 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.704125 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l" event={"ID":"0456f161-8adf-4f24-a254-2e3c418c21a4","Type":"ContainerDied","Data":"01e042fe5bf84931596111c11c102a6bdc6b9c5d0ef145d4623831d451a28810"} Apr 22 19:49:00.704203 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.704150 2569 scope.go:117] "RemoveContainer" containerID="157a3d86bf74a357186438b3b68e5789a9237f7d6439fe004ac41deebddeb899" Apr 22 19:49:00.713295 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.713277 2569 scope.go:117] "RemoveContainer" containerID="144289bdb588a97bc50d7ef08453cdded52714a3eaa352e45a2f63fcfc4e8f06" Apr 22 19:49:00.721233 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.721214 2569 scope.go:117] "RemoveContainer" containerID="dbb66a5b66c135c7cb04aec438385a15025eb74c772d6af8afe9520862d58d00" Apr 22 19:49:00.729402 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.729379 2569 scope.go:117] "RemoveContainer" containerID="157a3d86bf74a357186438b3b68e5789a9237f7d6439fe004ac41deebddeb899" Apr 22 19:49:00.729769 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:49:00.729735 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"157a3d86bf74a357186438b3b68e5789a9237f7d6439fe004ac41deebddeb899\": container with ID starting with 157a3d86bf74a357186438b3b68e5789a9237f7d6439fe004ac41deebddeb899 not found: ID does not exist" containerID="157a3d86bf74a357186438b3b68e5789a9237f7d6439fe004ac41deebddeb899" Apr 22 19:49:00.729887 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.729778 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157a3d86bf74a357186438b3b68e5789a9237f7d6439fe004ac41deebddeb899"} err="failed to get container status \"157a3d86bf74a357186438b3b68e5789a9237f7d6439fe004ac41deebddeb899\": rpc error: code = NotFound desc = could not find container \"157a3d86bf74a357186438b3b68e5789a9237f7d6439fe004ac41deebddeb899\": container with ID starting with 157a3d86bf74a357186438b3b68e5789a9237f7d6439fe004ac41deebddeb899 not found: ID does not exist" Apr 22 19:49:00.729887 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.729805 2569 scope.go:117] "RemoveContainer" containerID="144289bdb588a97bc50d7ef08453cdded52714a3eaa352e45a2f63fcfc4e8f06" Apr 22 19:49:00.730140 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:49:00.730089 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"144289bdb588a97bc50d7ef08453cdded52714a3eaa352e45a2f63fcfc4e8f06\": container with ID starting with 144289bdb588a97bc50d7ef08453cdded52714a3eaa352e45a2f63fcfc4e8f06 not found: ID does not exist" containerID="144289bdb588a97bc50d7ef08453cdded52714a3eaa352e45a2f63fcfc4e8f06" Apr 22 19:49:00.730264 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.730146 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"144289bdb588a97bc50d7ef08453cdded52714a3eaa352e45a2f63fcfc4e8f06"} err="failed to get container status \"144289bdb588a97bc50d7ef08453cdded52714a3eaa352e45a2f63fcfc4e8f06\": rpc error: code = NotFound desc = could not find container \"144289bdb588a97bc50d7ef08453cdded52714a3eaa352e45a2f63fcfc4e8f06\": container with ID starting with 144289bdb588a97bc50d7ef08453cdded52714a3eaa352e45a2f63fcfc4e8f06 not found: ID does not exist" Apr 22 19:49:00.730264 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.730168 2569 scope.go:117] "RemoveContainer" containerID="dbb66a5b66c135c7cb04aec438385a15025eb74c772d6af8afe9520862d58d00" Apr 22 19:49:00.730745 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:49:00.730723 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbb66a5b66c135c7cb04aec438385a15025eb74c772d6af8afe9520862d58d00\": container with ID starting with dbb66a5b66c135c7cb04aec438385a15025eb74c772d6af8afe9520862d58d00 not found: ID does not exist" containerID="dbb66a5b66c135c7cb04aec438385a15025eb74c772d6af8afe9520862d58d00" Apr 22 19:49:00.730852 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.730750 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb66a5b66c135c7cb04aec438385a15025eb74c772d6af8afe9520862d58d00"} err="failed to get container status \"dbb66a5b66c135c7cb04aec438385a15025eb74c772d6af8afe9520862d58d00\": rpc error: code = NotFound desc = could not find container \"dbb66a5b66c135c7cb04aec438385a15025eb74c772d6af8afe9520862d58d00\": container with ID starting with dbb66a5b66c135c7cb04aec438385a15025eb74c772d6af8afe9520862d58d00 not found: ID does not exist" Apr 22 19:49:00.732117 
ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.732075 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l"] Apr 22 19:49:00.741655 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:00.741626 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schenhj7l"] Apr 22 19:49:01.952115 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:01.952072 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0456f161-8adf-4f24-a254-2e3c418c21a4" path="/var/lib/kubelet/pods/0456f161-8adf-4f24-a254-2e3c418c21a4/volumes" Apr 22 19:49:01.952601 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:01.952553 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" path="/var/lib/kubelet/pods/54e1fe26-aa32-4431-b679-4cfc6b9dc645/volumes" Apr 22 19:49:03.724778 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:03.724731 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused" Apr 22 19:49:10.011392 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.011355 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx"] Apr 22 19:49:10.011876 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.011855 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="main" Apr 22 19:49:10.011974 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.011880 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="main" Apr 22 19:49:10.011974 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.011909 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="main" Apr 22 19:49:10.011974 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.011917 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="main" Apr 22 19:49:10.011974 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.011927 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0456f161-8adf-4f24-a254-2e3c418c21a4" containerName="tokenizer" Apr 22 19:49:10.011974 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.011935 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0456f161-8adf-4f24-a254-2e3c418c21a4" containerName="tokenizer" Apr 22 19:49:10.011974 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.011949 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="storage-initializer" Apr 22 19:49:10.011974 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.011958 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="storage-initializer" Apr 22 19:49:10.011974 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.011967 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0456f161-8adf-4f24-a254-2e3c418c21a4" 
containerName="storage-initializer" Apr 22 19:49:10.011974 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.011976 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0456f161-8adf-4f24-a254-2e3c418c21a4" containerName="storage-initializer" Apr 22 19:49:10.012417 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.011987 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0456f161-8adf-4f24-a254-2e3c418c21a4" containerName="main" Apr 22 19:49:10.012417 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.011995 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0456f161-8adf-4f24-a254-2e3c418c21a4" containerName="main" Apr 22 19:49:10.012417 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.012009 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="storage-initializer" Apr 22 19:49:10.012417 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.012017 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="storage-initializer" Apr 22 19:49:10.012417 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.012030 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="llm-d-routing-sidecar" Apr 22 19:49:10.012417 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.012039 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="llm-d-routing-sidecar" Apr 22 19:49:10.012417 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.012141 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="main" Apr 22 19:49:10.012417 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.012156 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0456f161-8adf-4f24-a254-2e3c418c21a4" containerName="main" Apr 22 19:49:10.012417 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.012166 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="571c0e08-d9fe-42ca-9c33-6e873d307ce4" containerName="llm-d-routing-sidecar" Apr 22 19:49:10.012417 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.012176 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0456f161-8adf-4f24-a254-2e3c418c21a4" containerName="tokenizer" Apr 22 19:49:10.012417 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.012187 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="54e1fe26-aa32-4431-b679-4cfc6b9dc645" containerName="main" Apr 22 19:49:10.017187 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.017164 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.019949 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.019922 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 22 19:49:10.024549 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.024523 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx"] Apr 22 19:49:10.161575 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.161528 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.161761 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.161605 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-home\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.161761 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.161637 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmgt6\" (UniqueName: \"kubernetes.io/projected/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-kube-api-access-jmgt6\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.161761 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.161672 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-dshm\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.161761 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.161717 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-tls-certs\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.161761 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.161746 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-model-cache\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.262439 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.262335 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.262439 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.262420 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-home\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.262685 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.262452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmgt6\" (UniqueName: \"kubernetes.io/projected/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-kube-api-access-jmgt6\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.262685 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.262492 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-dshm\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.262685 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.262522 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-tls-certs\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.262685 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.262553 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-model-cache\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.262933 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.262781 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.262933 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.262834 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-home\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" 
Apr 22 19:49:10.262933 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.262870 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-model-cache\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.265369 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.265344 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-dshm\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.265649 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.265628 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-tls-certs\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.273907 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.273876 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2"] Apr 22 19:49:10.277593 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.277566 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmgt6\" (UniqueName: \"kubernetes.io/projected/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-kube-api-access-jmgt6\") pod \"scheduler-inline-config-test-kserve-d7d757d45-99hrx\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.278748 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.278727 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.281812 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.281791 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-422v5\"" Apr 22 19:49:10.288156 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.288124 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2"] Apr 22 19:49:10.331073 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.331027 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:10.464752 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.464699 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83a18471-e62b-4301-a786-a8c857beeb20-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.464752 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.464755 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.465032 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.464781 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.465032 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.464797 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h85j6\" (UniqueName: \"kubernetes.io/projected/83a18471-e62b-4301-a786-a8c857beeb20-kube-api-access-h85j6\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.465032 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.464842 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.465032 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.464900 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.481886 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.481848 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx"] Apr 22 19:49:10.485780 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:49:10.485746 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e3091f9_7017_4dd8_a6fa_fe0edf5948f1.slice/crio-4caea15188a0ac3292704dc71427310734673ef6552d8a87f524b5d550001bb2 WatchSource:0}: Error finding container 4caea15188a0ac3292704dc71427310734673ef6552d8a87f524b5d550001bb2: Status 404 returned error can't find the container with id 4caea15188a0ac3292704dc71427310734673ef6552d8a87f524b5d550001bb2 Apr 22 19:49:10.565997 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.565946 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.566212 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.566018 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.566212 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.566134 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83a18471-e62b-4301-a786-a8c857beeb20-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.566212 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.566189 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.566407 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.566229 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.566407 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.566258 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h85j6\" (UniqueName: \"kubernetes.io/projected/83a18471-e62b-4301-a786-a8c857beeb20-kube-api-access-h85j6\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.566407 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.566320 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.566564 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.566421 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.566717 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.566694 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.566790 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.566735 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.568980 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.568963 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83a18471-e62b-4301-a786-a8c857beeb20-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.575958 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.575923 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h85j6\" (UniqueName: \"kubernetes.io/projected/83a18471-e62b-4301-a786-a8c857beeb20-kube-api-access-h85j6\") pod \"scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.599381 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.599335 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:10.751427 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.751202 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2"] Apr 22 19:49:10.751798 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.751766 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" event={"ID":"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1","Type":"ContainerStarted","Data":"4d6a5b124aee5784874c0215f47d1370382bb3ede382280f56e6e9d265ba4ffe"} Apr 22 19:49:10.751940 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:10.751811 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" event={"ID":"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1","Type":"ContainerStarted","Data":"4caea15188a0ac3292704dc71427310734673ef6552d8a87f524b5d550001bb2"} Apr 22 19:49:10.753475 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:49:10.753440 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83a18471_e62b_4301_a786_a8c857beeb20.slice/crio-8dbbe228e9949582ef3e89df80eccc1789454b4bff020f020bd4d825cd681cd5 WatchSource:0}: Error finding container 8dbbe228e9949582ef3e89df80eccc1789454b4bff020f020bd4d825cd681cd5: Status 404 returned error can't find the container with id 8dbbe228e9949582ef3e89df80eccc1789454b4bff020f020bd4d825cd681cd5 Apr 22 19:49:11.757858 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:11.757823 2569 generic.go:358] "Generic (PLEG): container finished" podID="83a18471-e62b-4301-a786-a8c857beeb20" containerID="fdf707fe71fce0317dec0beb2cfaaf18884d32b16f32b50195f9ef55f07cca4c" exitCode=0 Apr 22 19:49:11.758398 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:11.757961 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" event={"ID":"83a18471-e62b-4301-a786-a8c857beeb20","Type":"ContainerDied","Data":"fdf707fe71fce0317dec0beb2cfaaf18884d32b16f32b50195f9ef55f07cca4c"} Apr 22 19:49:11.758398 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:11.757992 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" event={"ID":"83a18471-e62b-4301-a786-a8c857beeb20","Type":"ContainerStarted","Data":"8dbbe228e9949582ef3e89df80eccc1789454b4bff020f020bd4d825cd681cd5"} Apr 22 19:49:12.764600 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:12.764551 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" event={"ID":"83a18471-e62b-4301-a786-a8c857beeb20","Type":"ContainerStarted","Data":"0526107a9adbaf0905f01c846a0dd4f5ac7c2182a63679908ff7f5b4a6c0cb8c"} Apr 22 19:49:12.764600 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:12.764597 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" event={"ID":"83a18471-e62b-4301-a786-a8c857beeb20","Type":"ContainerStarted","Data":"7c9786acb1392c85596ea8d5f453783756f44661fb43cc51d3f72887225613ed"} Apr 22 19:49:12.765210 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:12.764648 2569 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:12.792966 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:12.792892 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" podStartSLOduration=2.792863701 podStartE2EDuration="2.792863701s" podCreationTimestamp="2026-04-22 19:49:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:49:12.788297937 +0000 UTC m=+1525.436312768" watchObservedRunningTime="2026-04-22 19:49:12.792863701 +0000 UTC m=+1525.440878506" Apr 22 19:49:13.724537 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:13.724487 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused" Apr 22 19:49:20.599796 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:20.599755 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:20.600311 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:20.599808 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:20.602457 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:20.602427 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:20.803058 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:20.803023 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:21.806148 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:21.806091 2569 generic.go:358] "Generic (PLEG): container finished" podID="1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" containerID="4d6a5b124aee5784874c0215f47d1370382bb3ede382280f56e6e9d265ba4ffe" exitCode=0 Apr 22 19:49:21.806615 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:21.806170 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" event={"ID":"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1","Type":"ContainerDied","Data":"4d6a5b124aee5784874c0215f47d1370382bb3ede382280f56e6e9d265ba4ffe"} Apr 22 19:49:22.812223 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:22.812185 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" event={"ID":"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1","Type":"ContainerStarted","Data":"21795ed1e9788e073224880b7cb608c74fb67ff246f9c79cb042c198a7308d93"} Apr 22 19:49:22.838700 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:22.838649 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" podStartSLOduration=13.83863382 podStartE2EDuration="13.83863382s" podCreationTimestamp="2026-04-22 19:49:09 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:49:22.835990002 +0000 UTC m=+1535.484004847" watchObservedRunningTime="2026-04-22 19:49:22.83863382 +0000 UTC m=+1535.486648623" Apr 22 19:49:23.725020 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:23.724974 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused" Apr 22 19:49:30.331641 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:30.331601 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:30.331641 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:30.331644 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:30.344584 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:30.344557 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:30.859574 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:30.859546 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:33.724312 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:33.724260 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused" Apr 22 19:49:41.808692 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:41.808651 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:43.040457 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.040085 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx"] Apr 22 19:49:43.041503 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.041437 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" podUID="1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" containerName="main" containerID="cri-o://21795ed1e9788e073224880b7cb608c74fb67ff246f9c79cb042c198a7308d93" gracePeriod=30 Apr 22 19:49:43.045191 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.045147 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2"] Apr 22 19:49:43.045585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.045534 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" podUID="83a18471-e62b-4301-a786-a8c857beeb20" containerName="main" containerID="cri-o://7c9786acb1392c85596ea8d5f453783756f44661fb43cc51d3f72887225613ed" gracePeriod=30 Apr 22 19:49:43.045702 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:49:43.045583 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" podUID="83a18471-e62b-4301-a786-a8c857beeb20" containerName="tokenizer" containerID="cri-o://0526107a9adbaf0905f01c846a0dd4f5ac7c2182a63679908ff7f5b4a6c0cb8c" gracePeriod=30 Apr 22 19:49:43.312713 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.312688 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:43.474905 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.474858 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmgt6\" (UniqueName: \"kubernetes.io/projected/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-kube-api-access-jmgt6\") pod \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " Apr 22 19:49:43.475127 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.474996 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-tls-certs\") pod \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " Apr 22 19:49:43.475127 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.475044 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-dshm\") pod \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " Apr 22 19:49:43.475127 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.475073 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-home\") pod \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " Apr 22 19:49:43.475127 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.475121 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-kserve-provision-location\") pod \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " Apr 22 19:49:43.475371 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.475153 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-model-cache\") pod \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\" (UID: \"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1\") " Apr 22 19:49:43.475371 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.475286 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-home" (OuterVolumeSpecName: "home") pod "1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" (UID: "1e3091f9-7017-4dd8-a6fa-fe0edf5948f1"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:43.475501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.475471 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-home\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:43.475501 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.475470 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-model-cache" (OuterVolumeSpecName: "model-cache") pod "1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" (UID: "1e3091f9-7017-4dd8-a6fa-fe0edf5948f1"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:43.477408 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.477376 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-dshm" (OuterVolumeSpecName: "dshm") pod "1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" (UID: "1e3091f9-7017-4dd8-a6fa-fe0edf5948f1"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:43.477555 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.477524 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-kube-api-access-jmgt6" (OuterVolumeSpecName: "kube-api-access-jmgt6") pod "1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" (UID: "1e3091f9-7017-4dd8-a6fa-fe0edf5948f1"). InnerVolumeSpecName "kube-api-access-jmgt6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:49:43.477625 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.477556 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" (UID: "1e3091f9-7017-4dd8-a6fa-fe0edf5948f1"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:49:43.549338 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.549241 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" (UID: "1e3091f9-7017-4dd8-a6fa-fe0edf5948f1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:43.576344 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.576295 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:43.576344 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.576338 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-dshm\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:43.576344 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.576352 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:43.576595 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.576383 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-model-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:43.576595 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.576398 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jmgt6\" (UniqueName: \"kubernetes.io/projected/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1-kube-api-access-jmgt6\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:43.724754 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.724712 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused" Apr 22 19:49:43.903155 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.903044 2569 generic.go:358] "Generic (PLEG): container finished" podID="1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" containerID="21795ed1e9788e073224880b7cb608c74fb67ff246f9c79cb042c198a7308d93" exitCode=0 Apr 22 19:49:43.903155 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.903136 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" event={"ID":"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1","Type":"ContainerDied","Data":"21795ed1e9788e073224880b7cb608c74fb67ff246f9c79cb042c198a7308d93"} Apr 22 19:49:43.903395 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.903155 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" Apr 22 19:49:43.903395 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.903181 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx" event={"ID":"1e3091f9-7017-4dd8-a6fa-fe0edf5948f1","Type":"ContainerDied","Data":"4caea15188a0ac3292704dc71427310734673ef6552d8a87f524b5d550001bb2"} Apr 22 19:49:43.903395 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.903201 2569 scope.go:117] "RemoveContainer" containerID="21795ed1e9788e073224880b7cb608c74fb67ff246f9c79cb042c198a7308d93" Apr 22 19:49:43.906005 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.905964 2569 generic.go:358] "Generic (PLEG): container finished" podID="83a18471-e62b-4301-a786-a8c857beeb20" containerID="7c9786acb1392c85596ea8d5f453783756f44661fb43cc51d3f72887225613ed" exitCode=0 Apr 22 19:49:43.906183 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.906160 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" event={"ID":"83a18471-e62b-4301-a786-a8c857beeb20","Type":"ContainerDied","Data":"7c9786acb1392c85596ea8d5f453783756f44661fb43cc51d3f72887225613ed"} Apr 22 19:49:43.914266 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.914215 2569 scope.go:117] "RemoveContainer" containerID="4d6a5b124aee5784874c0215f47d1370382bb3ede382280f56e6e9d265ba4ffe" Apr 22 19:49:43.931556 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.931513 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx"] Apr 22 19:49:43.936303 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.936271 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d7d757d45-99hrx"] Apr 22 19:49:43.951692 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.951661 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" path="/var/lib/kubelet/pods/1e3091f9-7017-4dd8-a6fa-fe0edf5948f1/volumes" Apr 22 19:49:43.986570 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.986541 2569 scope.go:117] "RemoveContainer" containerID="21795ed1e9788e073224880b7cb608c74fb67ff246f9c79cb042c198a7308d93" Apr 22 19:49:43.986912 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:49:43.986890 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21795ed1e9788e073224880b7cb608c74fb67ff246f9c79cb042c198a7308d93\": container with ID starting with 21795ed1e9788e073224880b7cb608c74fb67ff246f9c79cb042c198a7308d93 not found: ID does not exist" containerID="21795ed1e9788e073224880b7cb608c74fb67ff246f9c79cb042c198a7308d93" Apr 22 19:49:43.986985 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.986925 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21795ed1e9788e073224880b7cb608c74fb67ff246f9c79cb042c198a7308d93"} err="failed to get container status \"21795ed1e9788e073224880b7cb608c74fb67ff246f9c79cb042c198a7308d93\": rpc error: code = NotFound desc = could not find container \"21795ed1e9788e073224880b7cb608c74fb67ff246f9c79cb042c198a7308d93\": container with ID starting with 21795ed1e9788e073224880b7cb608c74fb67ff246f9c79cb042c198a7308d93 not found: ID does not exist" Apr 22 19:49:43.986985 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:49:43.986946 2569 scope.go:117] "RemoveContainer" containerID="4d6a5b124aee5784874c0215f47d1370382bb3ede382280f56e6e9d265ba4ffe" Apr 22 19:49:43.987341 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:49:43.987313 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d6a5b124aee5784874c0215f47d1370382bb3ede382280f56e6e9d265ba4ffe\": container with ID starting with 4d6a5b124aee5784874c0215f47d1370382bb3ede382280f56e6e9d265ba4ffe not found: ID does not exist" containerID="4d6a5b124aee5784874c0215f47d1370382bb3ede382280f56e6e9d265ba4ffe" Apr 22 19:49:43.987466 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:43.987344 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d6a5b124aee5784874c0215f47d1370382bb3ede382280f56e6e9d265ba4ffe"} err="failed to get container status \"4d6a5b124aee5784874c0215f47d1370382bb3ede382280f56e6e9d265ba4ffe\": rpc error: code = NotFound desc = could not find container \"4d6a5b124aee5784874c0215f47d1370382bb3ede382280f56e6e9d265ba4ffe\": container with ID starting with 4d6a5b124aee5784874c0215f47d1370382bb3ede382280f56e6e9d265ba4ffe not found: ID does not exist" Apr 22 19:49:44.422998 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.422972 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:44.585574 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.585474 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83a18471-e62b-4301-a786-a8c857beeb20-tls-certs\") pod \"83a18471-e62b-4301-a786-a8c857beeb20\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " Apr 22 19:49:44.585574 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.585546 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h85j6\" (UniqueName: \"kubernetes.io/projected/83a18471-e62b-4301-a786-a8c857beeb20-kube-api-access-h85j6\") pod \"83a18471-e62b-4301-a786-a8c857beeb20\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " Apr 22 19:49:44.585811 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.585581 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-tmp\") pod \"83a18471-e62b-4301-a786-a8c857beeb20\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " Apr 22 19:49:44.585811 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.585635 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-uds\") pod \"83a18471-e62b-4301-a786-a8c857beeb20\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " Apr 22 19:49:44.585811 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.585681 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-kserve-provision-location\") pod \"83a18471-e62b-4301-a786-a8c857beeb20\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " Apr 22 19:49:44.585811 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.585706 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-cache\") pod \"83a18471-e62b-4301-a786-a8c857beeb20\" (UID: \"83a18471-e62b-4301-a786-a8c857beeb20\") " Apr 22 19:49:44.586032 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.585886 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "83a18471-e62b-4301-a786-a8c857beeb20" (UID: "83a18471-e62b-4301-a786-a8c857beeb20"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:44.586032 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.586003 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "83a18471-e62b-4301-a786-a8c857beeb20" (UID: "83a18471-e62b-4301-a786-a8c857beeb20"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:44.586032 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.586021 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "83a18471-e62b-4301-a786-a8c857beeb20" (UID: "83a18471-e62b-4301-a786-a8c857beeb20"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:44.586414 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.586320 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-tmp\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:44.586414 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.586343 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-uds\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:44.586414 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.586358 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-tokenizer-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:44.586414 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.586368 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "83a18471-e62b-4301-a786-a8c857beeb20" (UID: "83a18471-e62b-4301-a786-a8c857beeb20"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:44.587776 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.587753 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83a18471-e62b-4301-a786-a8c857beeb20-kube-api-access-h85j6" (OuterVolumeSpecName: "kube-api-access-h85j6") pod "83a18471-e62b-4301-a786-a8c857beeb20" (UID: "83a18471-e62b-4301-a786-a8c857beeb20"). InnerVolumeSpecName "kube-api-access-h85j6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:49:44.587879 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.587770 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83a18471-e62b-4301-a786-a8c857beeb20-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "83a18471-e62b-4301-a786-a8c857beeb20" (UID: "83a18471-e62b-4301-a786-a8c857beeb20"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:49:44.687029 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.686985 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a18471-e62b-4301-a786-a8c857beeb20-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:44.687029 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.687029 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83a18471-e62b-4301-a786-a8c857beeb20-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:44.687236 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.687043 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h85j6\" (UniqueName: \"kubernetes.io/projected/83a18471-e62b-4301-a786-a8c857beeb20-kube-api-access-h85j6\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:49:44.920847 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.920766 2569 generic.go:358] "Generic (PLEG): container finished" podID="83a18471-e62b-4301-a786-a8c857beeb20" containerID="0526107a9adbaf0905f01c846a0dd4f5ac7c2182a63679908ff7f5b4a6c0cb8c" exitCode=0 Apr 22 19:49:44.920847 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.920826 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" event={"ID":"83a18471-e62b-4301-a786-a8c857beeb20","Type":"ContainerDied","Data":"0526107a9adbaf0905f01c846a0dd4f5ac7c2182a63679908ff7f5b4a6c0cb8c"} Apr 22 19:49:44.921044 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.920851 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" event={"ID":"83a18471-e62b-4301-a786-a8c857beeb20","Type":"ContainerDied","Data":"8dbbe228e9949582ef3e89df80eccc1789454b4bff020f020bd4d825cd681cd5"} Apr 22 19:49:44.921044 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.920867 2569 scope.go:117] "RemoveContainer" containerID="0526107a9adbaf0905f01c846a0dd4f5ac7c2182a63679908ff7f5b4a6c0cb8c" Apr 22 19:49:44.921044 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.920896 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2" Apr 22 19:49:44.929877 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.929856 2569 scope.go:117] "RemoveContainer" containerID="7c9786acb1392c85596ea8d5f453783756f44661fb43cc51d3f72887225613ed" Apr 22 19:49:44.938975 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.938955 2569 scope.go:117] "RemoveContainer" containerID="fdf707fe71fce0317dec0beb2cfaaf18884d32b16f32b50195f9ef55f07cca4c" Apr 22 19:49:44.950167 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.950120 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2"] Apr 22 19:49:44.951301 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.951252 2569 scope.go:117] "RemoveContainer" containerID="0526107a9adbaf0905f01c846a0dd4f5ac7c2182a63679908ff7f5b4a6c0cb8c" Apr 22 19:49:44.951799 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:49:44.951601 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0526107a9adbaf0905f01c846a0dd4f5ac7c2182a63679908ff7f5b4a6c0cb8c\": container with ID starting with 0526107a9adbaf0905f01c846a0dd4f5ac7c2182a63679908ff7f5b4a6c0cb8c not found: ID does not exist" containerID="0526107a9adbaf0905f01c846a0dd4f5ac7c2182a63679908ff7f5b4a6c0cb8c" Apr 22 19:49:44.951799 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.951640 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0526107a9adbaf0905f01c846a0dd4f5ac7c2182a63679908ff7f5b4a6c0cb8c"} err="failed to get container status \"0526107a9adbaf0905f01c846a0dd4f5ac7c2182a63679908ff7f5b4a6c0cb8c\": rpc error: code = NotFound desc = could not find container \"0526107a9adbaf0905f01c846a0dd4f5ac7c2182a63679908ff7f5b4a6c0cb8c\": container with ID starting with 0526107a9adbaf0905f01c846a0dd4f5ac7c2182a63679908ff7f5b4a6c0cb8c not found: ID does not exist" Apr 22 19:49:44.951799 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.951668 2569 scope.go:117] "RemoveContainer" containerID="7c9786acb1392c85596ea8d5f453783756f44661fb43cc51d3f72887225613ed" Apr 22 19:49:44.952038 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:49:44.951965 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c9786acb1392c85596ea8d5f453783756f44661fb43cc51d3f72887225613ed\": container with ID starting with 7c9786acb1392c85596ea8d5f453783756f44661fb43cc51d3f72887225613ed not found: ID does not exist" containerID="7c9786acb1392c85596ea8d5f453783756f44661fb43cc51d3f72887225613ed" Apr 22 19:49:44.952038 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.951998 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c9786acb1392c85596ea8d5f453783756f44661fb43cc51d3f72887225613ed"} err="failed to get container status \"7c9786acb1392c85596ea8d5f453783756f44661fb43cc51d3f72887225613ed\": rpc error: code = NotFound desc = could not find container \"7c9786acb1392c85596ea8d5f453783756f44661fb43cc51d3f72887225613ed\": container with ID starting with 7c9786acb1392c85596ea8d5f453783756f44661fb43cc51d3f72887225613ed not found: ID does not exist" Apr 22 19:49:44.952038 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.952020 2569 scope.go:117] "RemoveContainer" containerID="fdf707fe71fce0317dec0beb2cfaaf18884d32b16f32b50195f9ef55f07cca4c" Apr 22 19:49:44.952458 
ip-10-0-140-242 kubenswrapper[2569]: E0422 19:49:44.952426 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf707fe71fce0317dec0beb2cfaaf18884d32b16f32b50195f9ef55f07cca4c\": container with ID starting with fdf707fe71fce0317dec0beb2cfaaf18884d32b16f32b50195f9ef55f07cca4c not found: ID does not exist" containerID="fdf707fe71fce0317dec0beb2cfaaf18884d32b16f32b50195f9ef55f07cca4c" Apr 22 19:49:44.952580 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.952457 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf707fe71fce0317dec0beb2cfaaf18884d32b16f32b50195f9ef55f07cca4c"} err="failed to get container status \"fdf707fe71fce0317dec0beb2cfaaf18884d32b16f32b50195f9ef55f07cca4c\": rpc error: code = NotFound desc = could not find container \"fdf707fe71fce0317dec0beb2cfaaf18884d32b16f32b50195f9ef55f07cca4c\": container with ID starting with fdf707fe71fce0317dec0beb2cfaaf18884d32b16f32b50195f9ef55f07cca4c not found: ID does not exist" Apr 22 19:49:44.954276 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:44.954252 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-54c6bm75m2"] Apr 22 19:49:45.951924 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:45.951880 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83a18471-e62b-4301-a786-a8c857beeb20" path="/var/lib/kubelet/pods/83a18471-e62b-4301-a786-a8c857beeb20/volumes" Apr 22 19:49:53.724290 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:49:53.724243 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused" Apr 22 19:50:03.724994 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:03.724950 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused" Apr 22 19:50:13.734070 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:13.734034 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:50:13.743164 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:13.742336 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:50:25.629610 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.629574 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp"] Apr 22 19:50:25.630596 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.630559 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="main" containerID="cri-o://43ca8d2025ab846d199b4817c980743a44da71c34ae187f0068a76ff4a89a72e" gracePeriod=30 Apr 22 19:50:25.639151 
ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.639118 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq"] Apr 22 19:50:25.640812 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.640787 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83a18471-e62b-4301-a786-a8c857beeb20" containerName="tokenizer" Apr 22 19:50:25.640965 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.640953 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="83a18471-e62b-4301-a786-a8c857beeb20" containerName="tokenizer" Apr 22 19:50:25.641071 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.641060 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83a18471-e62b-4301-a786-a8c857beeb20" containerName="storage-initializer" Apr 22 19:50:25.641180 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.641167 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="83a18471-e62b-4301-a786-a8c857beeb20" containerName="storage-initializer" Apr 22 19:50:25.641305 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.641292 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" containerName="main" Apr 22 19:50:25.641386 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.641376 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" containerName="main" Apr 22 19:50:25.641481 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.641472 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" containerName="storage-initializer" Apr 22 19:50:25.641559 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.641550 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" containerName="storage-initializer" Apr 22 19:50:25.641631 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.641622 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83a18471-e62b-4301-a786-a8c857beeb20" containerName="main" Apr 22 19:50:25.641724 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.641714 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="83a18471-e62b-4301-a786-a8c857beeb20" containerName="main" Apr 22 19:50:25.641946 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.641931 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="83a18471-e62b-4301-a786-a8c857beeb20" containerName="tokenizer" Apr 22 19:50:25.642039 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.642029 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="83a18471-e62b-4301-a786-a8c857beeb20" containerName="main" Apr 22 19:50:25.642159 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.642138 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e3091f9-7017-4dd8-a6fa-fe0edf5948f1" containerName="main" Apr 22 19:50:25.651391 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.651364 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.660528 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.660498 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-v4b9h\"" Apr 22 19:50:25.661856 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.661815 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq"] Apr 22 19:50:25.736789 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.736750 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.736789 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.736790 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.736996 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.736820 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt9d2\" (UniqueName: \"kubernetes.io/projected/26c42907-e92b-4f55-9103-60da23e816de-kube-api-access-zt9d2\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.736996 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.736955 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/26c42907-e92b-4f55-9103-60da23e816de-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.737074 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.737010 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.737074 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.737054 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.737170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.737144 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/26c42907-e92b-4f55-9103-60da23e816de-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.737214 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.737197 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.737293 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.737253 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/26c42907-e92b-4f55-9103-60da23e816de-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.838404 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.838360 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.838609 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.838416 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zt9d2\" (UniqueName: \"kubernetes.io/projected/26c42907-e92b-4f55-9103-60da23e816de-kube-api-access-zt9d2\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.838609 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.838470 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/26c42907-e92b-4f55-9103-60da23e816de-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.838609 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.838503 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.838800 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.838675 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" 
(UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.838800 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.838793 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.838911 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.838841 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/26c42907-e92b-4f55-9103-60da23e816de-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.838911 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.838847 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.838911 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.838883 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.839066 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.838917 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/26c42907-e92b-4f55-9103-60da23e816de-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.839066 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.838966 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.839379 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.839357 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.839450 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.839377 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.839623 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.839582 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/26c42907-e92b-4f55-9103-60da23e816de-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.841570 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.841546 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/26c42907-e92b-4f55-9103-60da23e816de-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.841679 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.841659 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/26c42907-e92b-4f55-9103-60da23e816de-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.847347 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.847321 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/26c42907-e92b-4f55-9103-60da23e816de-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.847520 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.847499 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt9d2\" (UniqueName: \"kubernetes.io/projected/26c42907-e92b-4f55-9103-60da23e816de-kube-api-access-zt9d2\") pod \"router-gateway-2-openshift-default-6866b85949-lbprq\" (UID: \"26c42907-e92b-4f55-9103-60da23e816de\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:25.970775 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:25.970742 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:26.129851 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:26.129813 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq"] Apr 22 19:50:26.133167 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:50:26.133134 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26c42907_e92b_4f55_9103_60da23e816de.slice/crio-a9e73fda5370af5700cccd4691ba36b936ae3ab0f5ce4aded930a4b17416ed2f WatchSource:0}: Error finding container a9e73fda5370af5700cccd4691ba36b936ae3ab0f5ce4aded930a4b17416ed2f: Status 404 returned error can't find the container with id a9e73fda5370af5700cccd4691ba36b936ae3ab0f5ce4aded930a4b17416ed2f Apr 22 19:50:26.135281 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:26.135246 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 19:50:26.135403 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:26.135322 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 19:50:26.135403 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:26.135364 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 19:50:27.088902 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:27.088864 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" event={"ID":"26c42907-e92b-4f55-9103-60da23e816de","Type":"ContainerStarted","Data":"fe2b2fc9ba09397e9f2726e6d93936ad9f0565fa2f201e7655215b1a13580f93"} Apr 22 19:50:27.088902 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:27.088909 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" event={"ID":"26c42907-e92b-4f55-9103-60da23e816de","Type":"ContainerStarted","Data":"a9e73fda5370af5700cccd4691ba36b936ae3ab0f5ce4aded930a4b17416ed2f"} Apr 22 19:50:27.113232 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:27.113169 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" podStartSLOduration=2.113152843 podStartE2EDuration="2.113152843s" podCreationTimestamp="2026-04-22 19:50:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:50:27.111510285 +0000 UTC m=+1599.759525090" watchObservedRunningTime="2026-04-22 19:50:27.113152843 +0000 UTC m=+1599.761167647" Apr 22 19:50:27.973349 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:27.973312 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:27.973699 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:27.973674 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" podUID="26c42907-e92b-4f55-9103-60da23e816de" containerName="istio-proxy" 
probeResult="failure" output="Get \"http://10.133.0.57:15021/healthz/ready\": dial tcp 10.133.0.57:15021: connect: connection refused" Apr 22 19:50:28.972350 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:28.972305 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" podUID="26c42907-e92b-4f55-9103-60da23e816de" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.57:15021/healthz/ready\": dial tcp 10.133.0.57:15021: connect: connection refused" Apr 22 19:50:29.975620 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:29.975589 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:30.099924 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:30.099894 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:30.100804 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:30.100782 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lbprq" Apr 22 19:50:40.455324 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.455230 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2"] Apr 22 19:50:40.460780 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.460755 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.465062 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.465039 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 22 19:50:40.466047 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.466020 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-f459q\"" Apr 22 19:50:40.524873 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.524841 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2"] Apr 22 19:50:40.580007 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.579971 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-tls-certs\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.580212 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.580071 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-home\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.580212 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.580132 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-model-cache\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.580212 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.580170 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghnxs\" (UniqueName: \"kubernetes.io/projected/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-kube-api-access-ghnxs\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.580212 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.580194 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.580212 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.580214 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-dshm\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.681267 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.681226 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-dshm\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.681481 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.681297 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-tls-certs\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.681481 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.681335 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-home\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.681481 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.681358 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-model-cache\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.681481 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.681382 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghnxs\" (UniqueName: \"kubernetes.io/projected/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-kube-api-access-ghnxs\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.681481 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.681416 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.681781 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.681760 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-home\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.681839 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.681792 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.681839 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.681810 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-model-cache\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.683846 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.683821 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-dshm\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.684008 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.683990 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-tls-certs\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.692901 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.692868 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghnxs\" (UniqueName: \"kubernetes.io/projected/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-kube-api-access-ghnxs\") pod \"router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 
19:50:40.771008 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.770917 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:40.919700 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:40.919676 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2"] Apr 22 19:50:40.921880 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:50:40.921847 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aaea256_a957_4b0f_a5a1_ce279eedb5b4.slice/crio-1ab4154fdaf843528a1385dfab8dfda44fdf122ff31d554ab0f8cbce150993b2 WatchSource:0}: Error finding container 1ab4154fdaf843528a1385dfab8dfda44fdf122ff31d554ab0f8cbce150993b2: Status 404 returned error can't find the container with id 1ab4154fdaf843528a1385dfab8dfda44fdf122ff31d554ab0f8cbce150993b2 Apr 22 19:50:41.141019 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:41.140923 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" event={"ID":"0aaea256-a957-4b0f-a5a1-ce279eedb5b4","Type":"ContainerStarted","Data":"9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b"} Apr 22 19:50:41.141019 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:41.140963 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" event={"ID":"0aaea256-a957-4b0f-a5a1-ce279eedb5b4","Type":"ContainerStarted","Data":"1ab4154fdaf843528a1385dfab8dfda44fdf122ff31d554ab0f8cbce150993b2"} Apr 22 19:50:41.141019 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:41.140987 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:42.146522 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:42.146482 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" event={"ID":"0aaea256-a957-4b0f-a5a1-ce279eedb5b4","Type":"ContainerStarted","Data":"138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4"} Apr 22 19:50:53.162072 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:53.162039 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:50:53.190226 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:53.190194 2569 generic.go:358] "Generic (PLEG): container finished" podID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerID="138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4" exitCode=0 Apr 22 19:50:53.190432 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:53.190258 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" event={"ID":"0aaea256-a957-4b0f-a5a1-ce279eedb5b4","Type":"ContainerDied","Data":"138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4"} Apr 22 19:50:54.196250 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:54.196212 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" event={"ID":"0aaea256-a957-4b0f-a5a1-ce279eedb5b4","Type":"ContainerStarted","Data":"8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4"} Apr 22 
19:50:54.222845 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:54.222794 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" podStartSLOduration=14.222778258 podStartE2EDuration="14.222778258s" podCreationTimestamp="2026-04-22 19:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:50:54.219923207 +0000 UTC m=+1626.867938023" watchObservedRunningTime="2026-04-22 19:50:54.222778258 +0000 UTC m=+1626.870793063" Apr 22 19:50:55.912650 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:55.912624 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:50:55.921794 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:55.921770 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56291835-4550-4b8a-921a-fd31c4d1d1d5-tls-certs\") pod \"56291835-4550-4b8a-921a-fd31c4d1d1d5\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " Apr 22 19:50:55.921935 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:55.921818 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-model-cache\") pod \"56291835-4550-4b8a-921a-fd31c4d1d1d5\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " Apr 22 19:50:55.921999 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:55.921981 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-kserve-provision-location\") pod \"56291835-4550-4b8a-921a-fd31c4d1d1d5\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " Apr 22 19:50:55.922063 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:55.922024 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-model-cache" (OuterVolumeSpecName: "model-cache") pod "56291835-4550-4b8a-921a-fd31c4d1d1d5" (UID: "56291835-4550-4b8a-921a-fd31c4d1d1d5"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:50:55.922137 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:55.922059 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvwl8\" (UniqueName: \"kubernetes.io/projected/56291835-4550-4b8a-921a-fd31c4d1d1d5-kube-api-access-fvwl8\") pod \"56291835-4550-4b8a-921a-fd31c4d1d1d5\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " Apr 22 19:50:55.922199 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:55.922134 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-home\") pod \"56291835-4550-4b8a-921a-fd31c4d1d1d5\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " Apr 22 19:50:55.922199 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:55.922175 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-dshm\") pod \"56291835-4550-4b8a-921a-fd31c4d1d1d5\" (UID: \"56291835-4550-4b8a-921a-fd31c4d1d1d5\") " Apr 22 19:50:55.922450 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:55.922418 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-model-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:50:55.922715 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:55.922688 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-home" (OuterVolumeSpecName: "home") pod "56291835-4550-4b8a-921a-fd31c4d1d1d5" (UID: "56291835-4550-4b8a-921a-fd31c4d1d1d5"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:50:55.924503 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:55.924234 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56291835-4550-4b8a-921a-fd31c4d1d1d5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "56291835-4550-4b8a-921a-fd31c4d1d1d5" (UID: "56291835-4550-4b8a-921a-fd31c4d1d1d5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:50:55.924612 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:55.924549 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56291835-4550-4b8a-921a-fd31c4d1d1d5-kube-api-access-fvwl8" (OuterVolumeSpecName: "kube-api-access-fvwl8") pod "56291835-4550-4b8a-921a-fd31c4d1d1d5" (UID: "56291835-4550-4b8a-921a-fd31c4d1d1d5"). InnerVolumeSpecName "kube-api-access-fvwl8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:50:55.924816 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:55.924788 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-dshm" (OuterVolumeSpecName: "dshm") pod "56291835-4550-4b8a-921a-fd31c4d1d1d5" (UID: "56291835-4550-4b8a-921a-fd31c4d1d1d5"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:50:55.984981 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:55.984938 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "56291835-4550-4b8a-921a-fd31c4d1d1d5" (UID: "56291835-4550-4b8a-921a-fd31c4d1d1d5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:50:56.023029 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.022994 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-home\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:50:56.023029 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.023025 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-dshm\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:50:56.023029 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.023034 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56291835-4550-4b8a-921a-fd31c4d1d1d5-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:50:56.023376 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.023045 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56291835-4550-4b8a-921a-fd31c4d1d1d5-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:50:56.023376 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.023058 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fvwl8\" (UniqueName: \"kubernetes.io/projected/56291835-4550-4b8a-921a-fd31c4d1d1d5-kube-api-access-fvwl8\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:50:56.205502 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.205469 2569 generic.go:358] "Generic (PLEG): container finished" podID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerID="43ca8d2025ab846d199b4817c980743a44da71c34ae187f0068a76ff4a89a72e" exitCode=137 Apr 22 19:50:56.205714 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.205521 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" event={"ID":"56291835-4550-4b8a-921a-fd31c4d1d1d5","Type":"ContainerDied","Data":"43ca8d2025ab846d199b4817c980743a44da71c34ae187f0068a76ff4a89a72e"} Apr 22 19:50:56.205714 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.205555 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" event={"ID":"56291835-4550-4b8a-921a-fd31c4d1d1d5","Type":"ContainerDied","Data":"c83b7c9da6cd2aacc6f8c027b8056c67af574470941c9eda9f5995a455a09697"} Apr 22 19:50:56.205714 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.205557 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp" Apr 22 19:50:56.205714 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.205571 2569 scope.go:117] "RemoveContainer" containerID="43ca8d2025ab846d199b4817c980743a44da71c34ae187f0068a76ff4a89a72e" Apr 22 19:50:56.229057 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.229032 2569 scope.go:117] "RemoveContainer" containerID="fe1dc8a3c3d708502ebf63de479d1342b3a27ab5849471ae997e9d882c534551" Apr 22 19:50:56.231739 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.231706 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp"] Apr 22 19:50:56.239008 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.238981 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-79b6dd4d5-m2jkp"] Apr 22 19:50:56.298771 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.298744 2569 scope.go:117] "RemoveContainer" containerID="43ca8d2025ab846d199b4817c980743a44da71c34ae187f0068a76ff4a89a72e" Apr 22 19:50:56.299269 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:50:56.299122 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ca8d2025ab846d199b4817c980743a44da71c34ae187f0068a76ff4a89a72e\": container with ID starting with 43ca8d2025ab846d199b4817c980743a44da71c34ae187f0068a76ff4a89a72e not found: ID does not exist" containerID="43ca8d2025ab846d199b4817c980743a44da71c34ae187f0068a76ff4a89a72e" Apr 22 19:50:56.299269 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.299161 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ca8d2025ab846d199b4817c980743a44da71c34ae187f0068a76ff4a89a72e"} err="failed to get container status \"43ca8d2025ab846d199b4817c980743a44da71c34ae187f0068a76ff4a89a72e\": rpc error: code = NotFound desc = could not find container \"43ca8d2025ab846d199b4817c980743a44da71c34ae187f0068a76ff4a89a72e\": container with ID starting with 43ca8d2025ab846d199b4817c980743a44da71c34ae187f0068a76ff4a89a72e not found: ID does not exist" Apr 22 19:50:56.299269 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.299187 2569 scope.go:117] "RemoveContainer" containerID="fe1dc8a3c3d708502ebf63de479d1342b3a27ab5849471ae997e9d882c534551" Apr 22 19:50:56.299486 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:50:56.299465 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe1dc8a3c3d708502ebf63de479d1342b3a27ab5849471ae997e9d882c534551\": container with ID starting with fe1dc8a3c3d708502ebf63de479d1342b3a27ab5849471ae997e9d882c534551 not found: ID does not exist" containerID="fe1dc8a3c3d708502ebf63de479d1342b3a27ab5849471ae997e9d882c534551" Apr 22 19:50:56.299527 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:56.299492 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe1dc8a3c3d708502ebf63de479d1342b3a27ab5849471ae997e9d882c534551"} err="failed to get container status \"fe1dc8a3c3d708502ebf63de479d1342b3a27ab5849471ae997e9d882c534551\": rpc error: code = NotFound desc = could not find container \"fe1dc8a3c3d708502ebf63de479d1342b3a27ab5849471ae997e9d882c534551\": container with ID starting with fe1dc8a3c3d708502ebf63de479d1342b3a27ab5849471ae997e9d882c534551 not found: ID does not exist" Apr 22 
19:50:57.952922 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:50:57.952876 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" path="/var/lib/kubelet/pods/56291835-4550-4b8a-921a-fd31c4d1d1d5/volumes" Apr 22 19:51:00.771236 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:51:00.771191 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:51:00.771742 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:51:00.771255 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:51:00.771742 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:51:00.771539 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8001/health\": dial tcp 10.133.0.58:8001: connect: connection refused" Apr 22 19:51:10.771805 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:51:10.771762 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8001/health\": dial tcp 10.133.0.58:8001: connect: connection refused" Apr 22 19:51:20.771643 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:51:20.771592 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8001/health\": dial tcp 10.133.0.58:8001: connect: connection refused" Apr 22 19:51:30.771874 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:51:30.771817 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8001/health\": dial tcp 10.133.0.58:8001: connect: connection refused" Apr 22 19:51:40.771800 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:51:40.771749 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8001/health\": dial tcp 10.133.0.58:8001: connect: connection refused" Apr 22 19:51:50.771971 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:51:50.771898 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8001/health\": dial tcp 10.133.0.58:8001: connect: connection refused" Apr 22 19:52:00.771462 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:00.771399 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8001/health\": dial tcp 10.133.0.58:8001: connect: connection 
refused" Apr 22 19:52:10.772179 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:10.772120 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8001/health\": dial tcp 10.133.0.58:8001: connect: connection refused" Apr 22 19:52:20.771445 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:20.771387 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8001/health\": dial tcp 10.133.0.58:8001: connect: connection refused" Apr 22 19:52:30.780762 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:30.780728 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:52:30.792667 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:30.792639 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:52:42.377572 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:42.377538 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2"] Apr 22 19:52:42.378111 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:42.378037 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="main" containerID="cri-o://8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4" gracePeriod=30 Apr 22 19:52:57.088712 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:57.088666 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-n4lsf_30245c48-e56c-4e0b-a86d-74ffeda7575b/istio-proxy/0.log" Apr 22 19:52:57.115939 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:57.115905 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lbprq_26c42907-e92b-4f55-9103-60da23e816de/istio-proxy/0.log" Apr 22 19:52:57.138893 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:57.138845 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:52:57.145010 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:57.144983 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/llm-d-routing-sidecar/0.log" Apr 22 19:52:57.152914 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:57.152871 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/storage-initializer/0.log" Apr 22 19:52:58.120294 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:58.120246 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-n4lsf_30245c48-e56c-4e0b-a86d-74ffeda7575b/istio-proxy/0.log" Apr 22 19:52:58.135818 
ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:58.135773 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lbprq_26c42907-e92b-4f55-9103-60da23e816de/istio-proxy/0.log" Apr 22 19:52:58.160243 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:58.160206 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:52:58.166636 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:58.166613 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/llm-d-routing-sidecar/0.log" Apr 22 19:52:58.175433 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:58.175396 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/storage-initializer/0.log" Apr 22 19:52:59.108665 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:59.108629 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-n4lsf_30245c48-e56c-4e0b-a86d-74ffeda7575b/istio-proxy/0.log" Apr 22 19:52:59.122779 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:59.122739 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lbprq_26c42907-e92b-4f55-9103-60da23e816de/istio-proxy/0.log" Apr 22 19:52:59.144457 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:59.144421 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:52:59.150267 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:59.150239 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/llm-d-routing-sidecar/0.log" Apr 22 19:52:59.158038 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:52:59.158015 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/storage-initializer/0.log" Apr 22 19:53:00.103774 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:00.103722 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-n4lsf_30245c48-e56c-4e0b-a86d-74ffeda7575b/istio-proxy/0.log" Apr 22 19:53:00.116802 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:00.116765 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lbprq_26c42907-e92b-4f55-9103-60da23e816de/istio-proxy/0.log" Apr 22 19:53:00.139541 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:00.139514 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:53:00.146926 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:00.146900 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/llm-d-routing-sidecar/0.log" Apr 22 19:53:00.155181 
ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:00.155149 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/storage-initializer/0.log" Apr 22 19:53:01.092086 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:01.092054 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-n4lsf_30245c48-e56c-4e0b-a86d-74ffeda7575b/istio-proxy/0.log" Apr 22 19:53:01.107679 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:01.107653 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lbprq_26c42907-e92b-4f55-9103-60da23e816de/istio-proxy/0.log" Apr 22 19:53:01.134505 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:01.134479 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:53:01.147868 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:01.147840 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/llm-d-routing-sidecar/0.log" Apr 22 19:53:01.157276 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:01.157246 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/storage-initializer/0.log" Apr 22 19:53:02.110203 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:02.110168 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-n4lsf_30245c48-e56c-4e0b-a86d-74ffeda7575b/istio-proxy/0.log" Apr 22 19:53:02.124164 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:02.124130 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lbprq_26c42907-e92b-4f55-9103-60da23e816de/istio-proxy/0.log" Apr 22 19:53:02.146629 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:02.146597 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:53:02.152712 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:02.152686 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/llm-d-routing-sidecar/0.log" Apr 22 19:53:02.161268 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:02.161235 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/storage-initializer/0.log" Apr 22 19:53:03.101063 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:03.101031 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-n4lsf_30245c48-e56c-4e0b-a86d-74ffeda7575b/istio-proxy/0.log" Apr 22 19:53:03.113618 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:03.113595 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lbprq_26c42907-e92b-4f55-9103-60da23e816de/istio-proxy/0.log" Apr 22 
19:53:03.135645 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:03.135617 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:53:03.141488 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:03.141460 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/llm-d-routing-sidecar/0.log" Apr 22 19:53:03.150694 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:03.150645 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/storage-initializer/0.log" Apr 22 19:53:04.093222 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:04.093183 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-n4lsf_30245c48-e56c-4e0b-a86d-74ffeda7575b/istio-proxy/0.log" Apr 22 19:53:04.106512 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:04.106486 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lbprq_26c42907-e92b-4f55-9103-60da23e816de/istio-proxy/0.log" Apr 22 19:53:04.135430 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:04.135401 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:53:04.141651 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:04.141621 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/llm-d-routing-sidecar/0.log" Apr 22 19:53:04.150651 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:04.150613 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/storage-initializer/0.log" Apr 22 19:53:05.103303 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:05.103266 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-n4lsf_30245c48-e56c-4e0b-a86d-74ffeda7575b/istio-proxy/0.log" Apr 22 19:53:05.116214 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:05.116183 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lbprq_26c42907-e92b-4f55-9103-60da23e816de/istio-proxy/0.log" Apr 22 19:53:05.150150 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:05.150121 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:53:05.169059 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:05.169021 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/llm-d-routing-sidecar/0.log" Apr 22 19:53:05.206062 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:05.206033 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/storage-initializer/0.log" Apr 22 
19:53:06.183181 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:06.183128 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-n4lsf_30245c48-e56c-4e0b-a86d-74ffeda7575b/istio-proxy/0.log" Apr 22 19:53:06.196685 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:06.196657 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lbprq_26c42907-e92b-4f55-9103-60da23e816de/istio-proxy/0.log" Apr 22 19:53:06.220868 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:06.220824 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:53:06.232021 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:06.231991 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/llm-d-routing-sidecar/0.log" Apr 22 19:53:06.241471 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:06.241447 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/storage-initializer/0.log" Apr 22 19:53:07.159722 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:07.159688 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-n4lsf_30245c48-e56c-4e0b-a86d-74ffeda7575b/istio-proxy/0.log" Apr 22 19:53:07.172248 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:07.172220 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lbprq_26c42907-e92b-4f55-9103-60da23e816de/istio-proxy/0.log" Apr 22 19:53:07.194716 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:07.194687 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:53:07.200634 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:07.200609 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/llm-d-routing-sidecar/0.log" Apr 22 19:53:07.208351 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:07.208321 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/storage-initializer/0.log" Apr 22 19:53:08.158443 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:08.158410 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-n4lsf_30245c48-e56c-4e0b-a86d-74ffeda7575b/istio-proxy/0.log" Apr 22 19:53:08.171122 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:08.171065 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lbprq_26c42907-e92b-4f55-9103-60da23e816de/istio-proxy/0.log" Apr 22 19:53:08.194080 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:08.194044 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 
19:53:08.200157 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:08.200126 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/llm-d-routing-sidecar/0.log" Apr 22 19:53:08.207549 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:08.207528 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/storage-initializer/0.log" Apr 22 19:53:09.120056 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:09.120016 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-n4lsf_30245c48-e56c-4e0b-a86d-74ffeda7575b/istio-proxy/0.log" Apr 22 19:53:09.132729 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:09.132697 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lbprq_26c42907-e92b-4f55-9103-60da23e816de/istio-proxy/0.log" Apr 22 19:53:09.156428 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:09.156400 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:53:09.164549 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:09.164520 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/llm-d-routing-sidecar/0.log" Apr 22 19:53:09.172007 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:09.171980 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/storage-initializer/0.log" Apr 22 19:53:10.091474 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:10.091441 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-n4lsf_30245c48-e56c-4e0b-a86d-74ffeda7575b/istio-proxy/0.log" Apr 22 19:53:10.121810 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:10.121784 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lbprq_26c42907-e92b-4f55-9103-60da23e816de/istio-proxy/0.log" Apr 22 19:53:10.144708 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:10.144672 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:53:10.153773 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:10.153744 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/llm-d-routing-sidecar/0.log" Apr 22 19:53:10.161734 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:10.161701 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/storage-initializer/0.log" Apr 22 19:53:11.152618 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:11.152583 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-8q9q2_71df9112-0f4c-45a0-8daa-f289e7cccb4f/istio-proxy/0.log" Apr 22 19:53:11.958892 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:11.958859 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-8q9q2_71df9112-0f4c-45a0-8daa-f289e7cccb4f/istio-proxy/0.log" Apr 22 19:53:12.378906 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.378793 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="llm-d-routing-sidecar" containerID="cri-o://9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b" gracePeriod=2 Apr 22 19:53:12.643606 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.643580 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:53:12.644253 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.644235 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:53:12.731009 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.730981 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-txgjc_5d7e87a7-8512-4326-8ce5-2e37c4dc83bf/manager/0.log" Apr 22 19:53:12.746503 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.746477 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2_0aaea256-a957-4b0f-a5a1-ce279eedb5b4/main/0.log" Apr 22 19:53:12.747170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.747068 2569 generic.go:358] "Generic (PLEG): container finished" podID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerID="8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4" exitCode=137 Apr 22 19:53:12.747170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.747133 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" event={"ID":"0aaea256-a957-4b0f-a5a1-ce279eedb5b4","Type":"ContainerDied","Data":"8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4"} Apr 22 19:53:12.747170 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.747161 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" Apr 22 19:53:12.747352 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.747176 2569 generic.go:358] "Generic (PLEG): container finished" podID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerID="9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b" exitCode=0 Apr 22 19:53:12.747352 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.747181 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" event={"ID":"0aaea256-a957-4b0f-a5a1-ce279eedb5b4","Type":"ContainerDied","Data":"9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b"} Apr 22 19:53:12.747352 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.747200 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2" event={"ID":"0aaea256-a957-4b0f-a5a1-ce279eedb5b4","Type":"ContainerDied","Data":"1ab4154fdaf843528a1385dfab8dfda44fdf122ff31d554ab0f8cbce150993b2"} Apr 22 19:53:12.747352 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.747219 2569 scope.go:117] "RemoveContainer" containerID="8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4" Apr 22 19:53:12.747975 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.747955 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghnxs\" (UniqueName: \"kubernetes.io/projected/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-kube-api-access-ghnxs\") pod \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " Apr 22 19:53:12.748123 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.747985 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-kserve-provision-location\") pod \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " Apr 22 19:53:12.748123 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.748026 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-model-cache\") pod \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " Apr 22 19:53:12.748123 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.748063 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-home\") pod \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " Apr 22 19:53:12.748375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.748127 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-dshm\") pod \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " Apr 22 19:53:12.748375 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.748163 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-tls-certs\") pod \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\" (UID: \"0aaea256-a957-4b0f-a5a1-ce279eedb5b4\") " Apr 22 19:53:12.748993 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:53:12.748615 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-model-cache" (OuterVolumeSpecName: "model-cache") pod "0aaea256-a957-4b0f-a5a1-ce279eedb5b4" (UID: "0aaea256-a957-4b0f-a5a1-ce279eedb5b4"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:53:12.748993 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.748734 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-home" (OuterVolumeSpecName: "home") pod "0aaea256-a957-4b0f-a5a1-ce279eedb5b4" (UID: "0aaea256-a957-4b0f-a5a1-ce279eedb5b4"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:53:12.750572 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.750545 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-kube-api-access-ghnxs" (OuterVolumeSpecName: "kube-api-access-ghnxs") pod "0aaea256-a957-4b0f-a5a1-ce279eedb5b4" (UID: "0aaea256-a957-4b0f-a5a1-ce279eedb5b4"). InnerVolumeSpecName "kube-api-access-ghnxs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:53:12.750889 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.750864 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0aaea256-a957-4b0f-a5a1-ce279eedb5b4" (UID: "0aaea256-a957-4b0f-a5a1-ce279eedb5b4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:53:12.751066 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.751043 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-dshm" (OuterVolumeSpecName: "dshm") pod "0aaea256-a957-4b0f-a5a1-ce279eedb5b4" (UID: "0aaea256-a957-4b0f-a5a1-ce279eedb5b4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:53:12.774485 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.774450 2569 scope.go:117] "RemoveContainer" containerID="138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4" Apr 22 19:53:12.802319 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.802284 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-tgt6j_2be03193-fbc2-4c07-b0e5-a853c833c1fc/limitador/0.log" Apr 22 19:53:12.816516 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.816470 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-c62xk_41c4f928-0361-4948-8ed4-a14000b7e054/manager/0.log" Apr 22 19:53:12.819181 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.819146 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0aaea256-a957-4b0f-a5a1-ce279eedb5b4" (UID: "0aaea256-a957-4b0f-a5a1-ce279eedb5b4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:53:12.849445 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.849411 2569 scope.go:117] "RemoveContainer" containerID="9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b" Apr 22 19:53:12.849734 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.849707 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ghnxs\" (UniqueName: \"kubernetes.io/projected/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-kube-api-access-ghnxs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:53:12.849843 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.849736 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-kserve-provision-location\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:53:12.849843 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.849747 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-model-cache\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:53:12.849843 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.849757 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-home\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:53:12.849843 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.849767 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-dshm\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:53:12.849843 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.849802 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0aaea256-a957-4b0f-a5a1-ce279eedb5b4-tls-certs\") on node \"ip-10-0-140-242.ec2.internal\" DevicePath \"\"" Apr 22 19:53:12.857758 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.857736 2569 scope.go:117] "RemoveContainer" containerID="8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4" Apr 22 19:53:12.858002 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:53:12.857981 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4\": container with ID starting with 8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4 not found: ID does not exist" containerID="8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4" Apr 22 19:53:12.858055 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.858011 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4"} err="failed to get container status \"8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4\": rpc error: code = NotFound desc = could not find container \"8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4\": container with ID starting with 8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4 not found: ID does not exist" Apr 22 19:53:12.858055 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.858030 2569 scope.go:117] "RemoveContainer" 
containerID="138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4" Apr 22 19:53:12.858287 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:53:12.858269 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4\": container with ID starting with 138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4 not found: ID does not exist" containerID="138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4" Apr 22 19:53:12.858334 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.858293 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4"} err="failed to get container status \"138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4\": rpc error: code = NotFound desc = could not find container \"138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4\": container with ID starting with 138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4 not found: ID does not exist" Apr 22 19:53:12.858334 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.858310 2569 scope.go:117] "RemoveContainer" containerID="9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b" Apr 22 19:53:12.858554 ip-10-0-140-242 kubenswrapper[2569]: E0422 19:53:12.858535 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b\": container with ID starting with 9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b not found: ID does not exist" containerID="9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b" Apr 22 19:53:12.858623 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.858563 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b"} err="failed to get container status \"9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b\": rpc error: code = NotFound desc = could not find container \"9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b\": container with ID starting with 9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b not found: ID does not exist" Apr 22 19:53:12.858623 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.858584 2569 scope.go:117] "RemoveContainer" containerID="8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4" Apr 22 19:53:12.858868 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.858850 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4"} err="failed to get container status \"8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4\": rpc error: code = NotFound desc = could not find container \"8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4\": container with ID starting with 8367bdc4ab270d454316912053f962daa905b711a6d36f810bded990b91a66c4 not found: ID does not exist" Apr 22 19:53:12.858915 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.858871 2569 scope.go:117] "RemoveContainer" containerID="138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4" Apr 22 19:53:12.859119 ip-10-0-140-242 kubenswrapper[2569]: I0422 
19:53:12.859076 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4"} err="failed to get container status \"138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4\": rpc error: code = NotFound desc = could not find container \"138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4\": container with ID starting with 138a2bc4c42a62793c6d04e538170bad36f500a79086bbeead6bee3bf3c098a4 not found: ID does not exist" Apr 22 19:53:12.859119 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.859115 2569 scope.go:117] "RemoveContainer" containerID="9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b" Apr 22 19:53:12.859337 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:12.859316 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b"} err="failed to get container status \"9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b\": rpc error: code = NotFound desc = could not find container \"9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b\": container with ID starting with 9e0daea3342e63b1edbc88a463f08a84772f8c08ad67b2b3a19874d9ed7ce96b not found: ID does not exist" Apr 22 19:53:13.072619 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:13.072581 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2"] Apr 22 19:53:13.074856 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:13.074832 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c488d7f4d-4wjh2"] Apr 22 19:53:13.952240 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:13.952207 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" path="/var/lib/kubelet/pods/0aaea256-a957-4b0f-a5a1-ce279eedb5b4/volumes" Apr 22 19:53:15.086807 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.086773 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x642z/must-gather-jzsx9"] Apr 22 19:53:15.087241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.087141 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="storage-initializer" Apr 22 19:53:15.087241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.087154 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="storage-initializer" Apr 22 19:53:15.087241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.087163 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="llm-d-routing-sidecar" Apr 22 19:53:15.087241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.087169 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="llm-d-routing-sidecar" Apr 22 19:53:15.087241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.087178 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="main" Apr 22 19:53:15.087241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.087184 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="main" Apr 22 19:53:15.087241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.087199 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="main" Apr 22 19:53:15.087241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.087204 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="main" Apr 22 19:53:15.087241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.087209 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="storage-initializer" Apr 22 19:53:15.087241 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.087215 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="storage-initializer" Apr 22 19:53:15.087582 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.087263 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="llm-d-routing-sidecar" Apr 22 19:53:15.087582 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.087272 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="56291835-4550-4b8a-921a-fd31c4d1d1d5" containerName="main" Apr 22 19:53:15.087582 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.087278 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0aaea256-a957-4b0f-a5a1-ce279eedb5b4" containerName="main" Apr 22 19:53:15.090261 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.090243 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x642z/must-gather-jzsx9" Apr 22 19:53:15.093014 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.092991 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x642z\"/\"openshift-service-ca.crt\"" Apr 22 19:53:15.094288 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.094265 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-x642z\"/\"default-dockercfg-6r5dj\"" Apr 22 19:53:15.094415 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.094302 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x642z\"/\"kube-root-ca.crt\"" Apr 22 19:53:15.098203 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.098012 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x642z/must-gather-jzsx9"] Apr 22 19:53:15.171605 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.171572 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b47aa9e1-6008-46eb-908b-3475509552f0-must-gather-output\") pod \"must-gather-jzsx9\" (UID: \"b47aa9e1-6008-46eb-908b-3475509552f0\") " pod="openshift-must-gather-x642z/must-gather-jzsx9" Apr 22 19:53:15.171605 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.171607 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnl6j\" (UniqueName: \"kubernetes.io/projected/b47aa9e1-6008-46eb-908b-3475509552f0-kube-api-access-hnl6j\") pod \"must-gather-jzsx9\" (UID: \"b47aa9e1-6008-46eb-908b-3475509552f0\") " pod="openshift-must-gather-x642z/must-gather-jzsx9" Apr 22 19:53:15.273035 
ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.272999 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b47aa9e1-6008-46eb-908b-3475509552f0-must-gather-output\") pod \"must-gather-jzsx9\" (UID: \"b47aa9e1-6008-46eb-908b-3475509552f0\") " pod="openshift-must-gather-x642z/must-gather-jzsx9" Apr 22 19:53:15.273035 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.273039 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnl6j\" (UniqueName: \"kubernetes.io/projected/b47aa9e1-6008-46eb-908b-3475509552f0-kube-api-access-hnl6j\") pod \"must-gather-jzsx9\" (UID: \"b47aa9e1-6008-46eb-908b-3475509552f0\") " pod="openshift-must-gather-x642z/must-gather-jzsx9" Apr 22 19:53:15.273433 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.273409 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b47aa9e1-6008-46eb-908b-3475509552f0-must-gather-output\") pod \"must-gather-jzsx9\" (UID: \"b47aa9e1-6008-46eb-908b-3475509552f0\") " pod="openshift-must-gather-x642z/must-gather-jzsx9" Apr 22 19:53:15.282412 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.282377 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnl6j\" (UniqueName: \"kubernetes.io/projected/b47aa9e1-6008-46eb-908b-3475509552f0-kube-api-access-hnl6j\") pod \"must-gather-jzsx9\" (UID: \"b47aa9e1-6008-46eb-908b-3475509552f0\") " pod="openshift-must-gather-x642z/must-gather-jzsx9" Apr 22 19:53:15.401254 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.401147 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x642z/must-gather-jzsx9" Apr 22 19:53:15.533712 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.533684 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x642z/must-gather-jzsx9"] Apr 22 19:53:15.535613 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:53:15.535587 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb47aa9e1_6008_46eb_908b_3475509552f0.slice/crio-d6af3d8eeca7301c5ad7ee128fc44b2f61c2a4d66fe1c9c0a286c3f9af5ff713 WatchSource:0}: Error finding container d6af3d8eeca7301c5ad7ee128fc44b2f61c2a4d66fe1c9c0a286c3f9af5ff713: Status 404 returned error can't find the container with id d6af3d8eeca7301c5ad7ee128fc44b2f61c2a4d66fe1c9c0a286c3f9af5ff713 Apr 22 19:53:15.537415 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.537396 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:53:15.761132 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:15.761075 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x642z/must-gather-jzsx9" event={"ID":"b47aa9e1-6008-46eb-908b-3475509552f0","Type":"ContainerStarted","Data":"d6af3d8eeca7301c5ad7ee128fc44b2f61c2a4d66fe1c9c0a286c3f9af5ff713"} Apr 22 19:53:16.767574 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:16.767522 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x642z/must-gather-jzsx9" event={"ID":"b47aa9e1-6008-46eb-908b-3475509552f0","Type":"ContainerStarted","Data":"33b4c3c5266945f5ee911ce30a9af80dde97de0b0079b77a5a97130a3ac6e586"} Apr 22 19:53:16.767574 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:16.767576 2569 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x642z/must-gather-jzsx9" event={"ID":"b47aa9e1-6008-46eb-908b-3475509552f0","Type":"ContainerStarted","Data":"3f58df702c18943601d31abeef0ffacec70c882fd53e3999daebc2d8bc9424f8"} Apr 22 19:53:16.786715 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:16.786652 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x642z/must-gather-jzsx9" podStartSLOduration=0.867058283 podStartE2EDuration="1.786636217s" podCreationTimestamp="2026-04-22 19:53:15 +0000 UTC" firstStartedPulling="2026-04-22 19:53:15.53755067 +0000 UTC m=+1768.185565452" lastFinishedPulling="2026-04-22 19:53:16.4571286 +0000 UTC m=+1769.105143386" observedRunningTime="2026-04-22 19:53:16.784702279 +0000 UTC m=+1769.432717086" watchObservedRunningTime="2026-04-22 19:53:16.786636217 +0000 UTC m=+1769.434651019" Apr 22 19:53:18.056082 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:18.056052 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-jwtj4_a3e9c485-cff6-44ce-b842-b27605d809bb/global-pull-secret-syncer/0.log" Apr 22 19:53:18.135888 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:18.135828 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pjlfc_81956a4f-380b-43d9-919e-60fbb787f267/konnectivity-agent/0.log" Apr 22 19:53:18.192505 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:18.192468 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-242.ec2.internal_a6923805a814b270020f7b819e6da6c2/haproxy/0.log" Apr 22 19:53:22.093585 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:22.093538 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-txgjc_5d7e87a7-8512-4326-8ce5-2e37c4dc83bf/manager/0.log" Apr 22 19:53:22.205312 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:22.205275 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-tgt6j_2be03193-fbc2-4c07-b0e5-a853c833c1fc/limitador/0.log" Apr 22 19:53:22.233307 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:22.233268 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-c62xk_41c4f928-0361-4948-8ed4-a14000b7e054/manager/0.log" Apr 22 19:53:23.486528 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:23.486484 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-h826t_3b4d0dd1-0012-4f37-aedf-467520762f8d/monitoring-plugin/0.log" Apr 22 19:53:23.655322 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:23.655284 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-q4gf4_b3a61eaf-ba59-4d3f-97cc-68c70e44c797/node-exporter/0.log" Apr 22 19:53:23.673931 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:23.673878 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-q4gf4_b3a61eaf-ba59-4d3f-97cc-68c70e44c797/kube-rbac-proxy/0.log" Apr 22 19:53:23.691670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:23.691633 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-q4gf4_b3a61eaf-ba59-4d3f-97cc-68c70e44c797/init-textfile/0.log" Apr 22 19:53:23.718264 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:23.718229 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-9cf6k_8f432698-4844-4c25-b51c-849193e9c061/kube-rbac-proxy-main/0.log" Apr 22 19:53:23.735021 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:23.734961 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-9cf6k_8f432698-4844-4c25-b51c-849193e9c061/kube-rbac-proxy-self/0.log" Apr 22 19:53:23.754706 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:23.754627 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-9cf6k_8f432698-4844-4c25-b51c-849193e9c061/openshift-state-metrics/0.log" Apr 22 19:53:26.845670 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:26.845639 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7"] Apr 22 19:53:26.851860 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:26.851835 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:26.858570 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:26.858543 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7"] Apr 22 19:53:26.900054 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:26.899857 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5b60f0c1-761a-47c2-be72-d41729e7e859-proc\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:26.900054 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:26.899943 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b60f0c1-761a-47c2-be72-d41729e7e859-sys\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:26.900054 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:26.899973 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5b60f0c1-761a-47c2-be72-d41729e7e859-lib-modules\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:26.900054 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:26.900037 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5b60f0c1-761a-47c2-be72-d41729e7e859-podres\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:26.900462 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:26.900073 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qz7d\" (UniqueName: \"kubernetes.io/projected/5b60f0c1-761a-47c2-be72-d41729e7e859-kube-api-access-4qz7d\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 
19:53:27.000800 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.000748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5b60f0c1-761a-47c2-be72-d41729e7e859-podres\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:27.001004 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.000824 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qz7d\" (UniqueName: \"kubernetes.io/projected/5b60f0c1-761a-47c2-be72-d41729e7e859-kube-api-access-4qz7d\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:27.001004 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.000899 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5b60f0c1-761a-47c2-be72-d41729e7e859-proc\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:27.001004 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.000958 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b60f0c1-761a-47c2-be72-d41729e7e859-sys\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:27.001004 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.000986 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5b60f0c1-761a-47c2-be72-d41729e7e859-lib-modules\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:27.001366 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.001199 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5b60f0c1-761a-47c2-be72-d41729e7e859-lib-modules\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:27.001366 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.001306 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5b60f0c1-761a-47c2-be72-d41729e7e859-podres\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:27.001698 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.001669 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5b60f0c1-761a-47c2-be72-d41729e7e859-proc\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:27.001775 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.001741 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/5b60f0c1-761a-47c2-be72-d41729e7e859-sys\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:27.009996 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.009971 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qz7d\" (UniqueName: \"kubernetes.io/projected/5b60f0c1-761a-47c2-be72-d41729e7e859-kube-api-access-4qz7d\") pod \"perf-node-gather-daemonset-grmc7\" (UID: \"5b60f0c1-761a-47c2-be72-d41729e7e859\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:27.165463 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.165378 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:27.322265 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.322188 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7"] Apr 22 19:53:27.326120 ip-10-0-140-242 kubenswrapper[2569]: W0422 19:53:27.326073 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5b60f0c1_761a_47c2_be72_d41729e7e859.slice/crio-26a4afe63c4bb7b5ea0dff370b671a0332691d38e212b03d3188ce8b5b161114 WatchSource:0}: Error finding container 26a4afe63c4bb7b5ea0dff370b671a0332691d38e212b03d3188ce8b5b161114: Status 404 returned error can't find the container with id 26a4afe63c4bb7b5ea0dff370b671a0332691d38e212b03d3188ce8b5b161114 Apr 22 19:53:27.805348 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.805261 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mmxtx_7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78/dns/0.log" Apr 22 19:53:27.826081 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.826047 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mmxtx_7a1ab1b7-70c4-4d99-8e11-7ae79f3b3c78/kube-rbac-proxy/0.log" Apr 22 19:53:27.844801 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.844759 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" event={"ID":"5b60f0c1-761a-47c2-be72-d41729e7e859","Type":"ContainerStarted","Data":"81004fe7b5e47cbab8aafaaffb24cdde5ae2bdf106176aeead9a336ae347adb5"} Apr 22 19:53:27.844801 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.844794 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" event={"ID":"5b60f0c1-761a-47c2-be72-d41729e7e859","Type":"ContainerStarted","Data":"26a4afe63c4bb7b5ea0dff370b671a0332691d38e212b03d3188ce8b5b161114"} Apr 22 19:53:27.845037 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.844913 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:27.856911 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.856883 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9rtnc_51e23bda-7f24-43f3-9b0b-9e0f8a95c02f/dns-node-resolver/0.log" Apr 22 19:53:27.876488 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:27.876433 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" podStartSLOduration=1.87641802 podStartE2EDuration="1.87641802s" 
podCreationTimestamp="2026-04-22 19:53:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:53:27.875343923 +0000 UTC m=+1780.523358728" watchObservedRunningTime="2026-04-22 19:53:27.87641802 +0000 UTC m=+1780.524432824" Apr 22 19:53:28.383687 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:28.383660 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6b4cg_3dc70558-ecae-4e50-82a2-3b1c70e5cfb2/node-ca/0.log" Apr 22 19:53:29.246943 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:29.246912 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-8q9q2_71df9112-0f4c-45a0-8daa-f289e7cccb4f/istio-proxy/0.log" Apr 22 19:53:29.729369 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:29.729339 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zgpcw_99054ff8-b2bf-49da-9d88-9f03b317fea0/serve-healthcheck-canary/0.log" Apr 22 19:53:30.241959 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:30.241926 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d55df_2a39dd47-7813-45d4-bf4e-249d40368c54/kube-rbac-proxy/0.log" Apr 22 19:53:30.257661 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:30.257630 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d55df_2a39dd47-7813-45d4-bf4e-249d40368c54/exporter/0.log" Apr 22 19:53:30.274321 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:30.274292 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d55df_2a39dd47-7813-45d4-bf4e-249d40368c54/extractor/0.log" Apr 22 19:53:33.323477 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:33.323443 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-545d8995fb-bmn9x_975d2e8d-8fee-4ec8-827c-3fae179595dc/manager/0.log" Apr 22 19:53:33.471599 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:33.471553 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-768c94fb69-x9cvs_c42cce3c-2847-4703-aa07-1c53dbf3a75f/manager/0.log" Apr 22 19:53:33.488752 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:33.488724 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-tjsmp_595b3874-3bb5-4343-885b-50df17bddd1b/server/0.log" Apr 22 19:53:33.862537 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:33.862497 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-grmc7" Apr 22 19:53:33.864192 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:33.864170 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-z28l8_d3e6cab3-ce19-45ec-9758-9e2c0f5d29cd/manager/0.log" Apr 22 19:53:33.881870 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:33.881839 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-nw9l2_9b3ab1b0-053b-458b-bf44-8c266bd8d7cc/s3-init/0.log" Apr 22 19:53:33.908319 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:33.908293 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-q8hsv_c777dda1-33cf-445a-91a2-15b066fd5d2e/seaweedfs/0.log" Apr 22 19:53:39.753707 ip-10-0-140-242 
kubenswrapper[2569]: I0422 19:53:39.753669 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x9sbh_bf2d6d69-a4bd-4d9a-b48c-1f85a054c228/kube-multus-additional-cni-plugins/0.log" Apr 22 19:53:39.770765 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:39.770732 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x9sbh_bf2d6d69-a4bd-4d9a-b48c-1f85a054c228/egress-router-binary-copy/0.log" Apr 22 19:53:39.786156 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:39.786125 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x9sbh_bf2d6d69-a4bd-4d9a-b48c-1f85a054c228/cni-plugins/0.log" Apr 22 19:53:39.802085 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:39.802059 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x9sbh_bf2d6d69-a4bd-4d9a-b48c-1f85a054c228/bond-cni-plugin/0.log" Apr 22 19:53:39.816459 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:39.816433 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x9sbh_bf2d6d69-a4bd-4d9a-b48c-1f85a054c228/routeoverride-cni/0.log" Apr 22 19:53:39.832535 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:39.832510 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x9sbh_bf2d6d69-a4bd-4d9a-b48c-1f85a054c228/whereabouts-cni-bincopy/0.log" Apr 22 19:53:39.848471 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:39.848442 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x9sbh_bf2d6d69-a4bd-4d9a-b48c-1f85a054c228/whereabouts-cni/0.log" Apr 22 19:53:40.067913 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:40.067811 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h476q_a41b4bf8-7bc3-4be1-bb23-1c56997325bd/kube-multus/0.log" Apr 22 19:53:40.112919 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:40.112884 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dx52z_f4583537-f5a4-4201-a5ba-5c41cf04b3da/network-metrics-daemon/0.log" Apr 22 19:53:40.128809 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:40.128776 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dx52z_f4583537-f5a4-4201-a5ba-5c41cf04b3da/kube-rbac-proxy/0.log" Apr 22 19:53:41.053245 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:41.053208 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-controller/0.log" Apr 22 19:53:41.068146 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:41.068119 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/0.log" Apr 22 19:53:41.086013 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:41.085986 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovn-acl-logging/1.log" Apr 22 19:53:41.106944 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:41.106913 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/kube-rbac-proxy-node/0.log" Apr 22 19:53:41.127292 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:41.127262 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:53:41.141033 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:41.141000 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/northd/0.log" Apr 22 19:53:41.157212 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:41.157178 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/nbdb/0.log" Apr 22 19:53:41.171977 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:41.171946 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/sbdb/0.log" Apr 22 19:53:41.368331 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:41.368242 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99kk8_94c9353c-64db-4c45-9df3-30ea8b6efb63/ovnkube-controller/0.log" Apr 22 19:53:42.875525 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:42.875488 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mjd2c_6897e3de-61a5-4d68-9638-35ac613b4f31/network-check-target-container/0.log" Apr 22 19:53:43.847897 ip-10-0-140-242 kubenswrapper[2569]: I0422 19:53:43.847872 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qnv98_319d7fc4-bd09-4f40-bc9c-908e50f344ed/iptables-alerter/0.log"